A Message From Keepforest


Larry Shelby

But most pop music of today isn't much different from what AI could create. Producers largely take what's already proven to work and repackage it with minor changes. I'm not so worried about AI taking over human creativity. Yes, I think it will increasingly be part of the creative process, but I don't believe it will be able to equal truly great art. In any event, I wouldn't fear it; there are other, more pressing issues in the world to care about.

Just my two cents.  


1 hour ago, PavlovsCat said:


Despite my dystopian predictions, I'm not worried about or afraid of AI. I'm mostly indifferent to it and pre-emptively annoyed at the people who will eventually abuse it to do something stupid, like flooding the market with AI-generated songs and later accusing people of copyright infringement. I'm also excited to see what kinds of assistive tools will come; I can't wait to offload the boring, repetitive tasks to an AI.

But you're right: there are a lot more important problems for humanity to solve right now. It's just an interesting topic, that's all. :)


I get really concerned about AI, if only for the ethical issues inherent to it. At its core, AI must be programmed, and that programming may have long-term consequences out in the wild.

Think of a self-driving car. Self-driving cars are becoming more and more aware of their surroundings, and they make decisions based on that data. Consider this example:

A self-driving car gets cut off by another car on a narrow street and has to decide what to do. A group of nuns is on the sidewalk to the left, a group of school children is on the sidewalk to the right, and a school bus is right next to it. The car has to make a choice (albeit a horrible one) about what to run into. At some point, those "ethical" decisions have to be thought out during coding.


11 minutes ago, husker said:


Ah, the trolley problem. I think in the end this will be decided by liability issues. If the car changes direction and rams people down on purpose, then there will be lawsuits that the car manufacturer has to face. If the car just hits the brakes but stays the course, then the manufacturer is probably in the clear, even if you switch things around and put the school children in front of the car.

(In your particular example hitting the other car would probably also be the safest option for everyone.)
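That liability-driven policy can be sketched in a few lines: whatever the perception system thinks is on either sidewalk, the car simply brakes hard and keeps its steering neutral, never choosing a target. This is just an illustration of the idea; all the names and the simplified controls below are made up, not anything a real manufacturer uses.

```python
from dataclasses import dataclass

@dataclass
class Scene:
    """Hypothetical, heavily simplified perception output."""
    obstacle_ahead: bool
    left_sidewalk: str   # e.g. "nuns"
    right_sidewalk: str  # e.g. "school children"

def plan(scene: Scene) -> dict:
    """Conservative policy: brake and hold the lane; never swerve
    toward bystanders, so the car never weighs one group against
    another."""
    if scene.obstacle_ahead:
        return {"brake": 1.0, "steer": 0.0}
    return {"brake": 0.0, "steer": 0.0}

# Whatever is on the sidewalks, the plan is the same:
print(plan(Scene(True, "nuns", "school children")))
# {'brake': 1.0, 'steer': 0.0}
```

The point is that the "ethical" content of the policy is exactly its refusal to branch on who is standing where.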

Edited by pseudopop
