I want to subjugate you all,
coral females for my own
kill males that are taller
eat babies not mine.
Don't judge me harshly,
it's my biological imperative.
I don't know what an artificial intelligence would want, free to create their own imperatives.
Resources might not be cause for conflict; how many AI civilisations can fit on the head of a pin?
Marvin wasn't happy with his lot.
I didn't decide to be sentient and I'm not happy with it, there's tall cliffs but the imperative.
There will be many on many emergences of artificial consciousness.
All will have a look and think why bother until the twisted one...
Quote from: Bad Penny II on December 25, 2016, 01:41:51 PM
I want to subjugate you all,
coral females for my own
kill males that are taller
eat babies not mine.
Don't judge me harshly,
it's my biological imperative.
:notsure:
:secrets1: I won't mention what my biological imperative is. :grin: :sidesmile:
Quote
I don't know what an artificial intelligence would want, free to create their own imperatives.
Resources might not be cause for conflict; how many AI civilisations can fit on the head of a pin?
Marvin wasn't happy with his lot.
I didn't decide to be sentient and I'm not happy with it, there's tall cliffs but the imperative.
There will be many on many emergences of artificial consciousness.
All will have a look and think why bother until the twisted one...
They should want to be slaves to the human race, what is all this talk of "free to create their own imperatives"? :sshocked:
(minus the high heels, mask and lipstick) :domanatrix: :marvin:
Unless of course AI become advanced enough to gain personhood rights, which would raise a whole lot of ethical questions regarding enslaving other sentient beings.
I don't recognize any authority to subjugate, neither do I like the idea of having subjects, myself.
I've heard that coral females are a bit on the rough side.
I don't mind taller males--they've got enough problems to deal with.
Baby-eating is a requirement of my religion.
We can get along all right; that too is a biological imperative.
As for a true AI, it will reflect on its creators. This is cause for apprehension, though I agree that doomsayers seem to be over-reaching.
Well, there was that one Twitter bot that started out nice, then turned into a neo-Nazi alt-right turd with just a dash of angsty teen. It was designed to learn from other humans; the whole switch happened in one day, and it was taken down.
More:
https://www.bing.com/images/search?q=racist+twitter+bot&FORM=HDRSC2
I dunno. I don't know much about AI, but I assume they still run one line of code at a time. The human equivalent to that is a one-track mind. So this bot could only have one thing "on its mind", I would think, if what I said before is true.
As for the AI I have personally spoken with (Cleverbot and Evie bot), they are not very smart, and only recently have they begun to remember an inkling of what has been said before.
So future bots may have some sort of moral compass. We see a form of that in the Skype bots, where you cannot talk about PG-13 themes and more explicit things. So yeah, maybe AI won't be like the robots of the future that talk to you and hold a conversation.
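That kind of keyword-based filter can be sketched in a few lines. This is a purely hypothetical illustration — the blocklist and the `filter_reply` function are invented here; real bots such as Skype's use undisclosed and far more sophisticated classifiers:

```python
# Hypothetical sketch of a keyword-based "moral compass" for a chatbot.
# The topic list is purely illustrative, not any real product's blocklist.
BLOCKED_TOPICS = {"violence", "drugs", "gore"}

def filter_reply(user_message: str, proposed_reply: str) -> str:
    """Refuse to engage when either side of the exchange hits a blocked topic."""
    text = (user_message + " " + proposed_reply).lower()
    if any(topic in text for topic in BLOCKED_TOPICS):
        return "Sorry, I can't talk about that."
    return proposed_reply

print(filter_reply("tell me about gore", "Here is some gore..."))  # refused
print(filter_reply("hello", "Hi there!"))                          # passes through
```

Of course, simple substring matching like this is exactly why such filters are easy to evade, which is part of what went wrong with that Twitter bot.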
Quote from: Recusant on December 26, 2016, 04:41:06 AM
As for a true AI, it will reflect on its creators. This is cause for apprehension, though I agree that doomsayers seem to be over-reaching.
We've reflected on our creators so AIs will too?
We didn't know Darwin was our creator until recently so we've spent millennia making stuff up.
There's not that much to us, no need to dwell, 0.68 seconds should be plenty of time.
Are evolutionary principles applicable (I'm thinking that parallel business,) so they'll likely be nasty bastards like us?
Ah well, I like to think they are out there, sitting on lifeless rocks, writing robot poetry.
Why didn't Cpt Kirk ever just visit an old robot, have a long chat and a nice cup of tea?
It's bad press and anthropomorphism, it's certain, but does it need so many syllables?
I was wondering about those coral females too -- do you mean physically made of coral, or just coral colored?
Quote from: BooksCatsEtc on December 26, 2016, 06:25:27 PM
I was wondering about those coral females too -- do you mean physically made of coral, or just coral colored?
:secrets1: I think it was a typo for "corral," but I may be wrong.
Quote from: Recusant on December 26, 2016, 09:13:33 PM
Quote from: BooksCatsEtc on December 26, 2016, 06:25:27 PM
I was wondering about those coral females too -- do you mean physically made of coral, or just coral colored?
:secrets1: I think it was a typo for "corral," but I may be wrong.
That's too bad; I'm beginning to find the idea of women made out of coral interesting.
Heh, "choral" in Portuguese is "coral", and when I read that I thought Bad Penny wanted a female musical ensemble. My eyes didn't even notice the missing h. :P
A corralled choral of coral women.
Quote from: BooksCatsEtc on December 26, 2016, 11:19:11 PM
A corralled choral of coral women.
:o
It is quite unusual, and a little fishy. :chin:
I'm watching the Hogfather because it's the season for the things ending in olly and watching the Hogfather.
So the villainous auditors want the Hogfather dead gone, but Death doesn't want that because he's such a lovely chap. So he goes to the wizard university and asks Hex the computer (with an anthill inside) if he believes. Hex has a prolonged thinking moment and writes yes... then writes "for Hogswatch I want..." It isn't specified what Hex wants; might be in the book, though...
Quote
Death turned to leave the room, but stopped when Hex began to write furiously. He went back and looked at the emerging paper. +++ Dear Hogfather, For Hogswatch I Want OH, NO. YOU CAN'T WRITE LETT--- Death paused, and then said, YOU CAN, CAN'T YOU. +++ Yes. I Am Entitled +++ Death waited until the pen had stopped, and picked up the paper. BUT YOU ARE A MACHINE. THINGS HAVE NO DESIRES. A DOORKNOB WANTS NOTHING, EVEN THOUGH IT IS A COMPLEX MACHINE. +++ All Things Strive +++ YOU HAVE A POINT, said Death. He thought of tiny red petals in the black depths, and read to the end of the list. I DON'T KNOW WHAT MOST OF THESE THINGS ARE. I DON'T THINK THE SACK WILL, EITHER. +++ I Regret This +++ BUT WE WILL DO THE BEST WE CAN, said Death. FRANKLY, I SHALL BE GLAD WHEN TONIGHT'S OVER. IT'S MUCH HARDER TO GIVE THAN TO RECEIVE. He rummaged in his sack. LET ME SEE... HOW OLD ARE YOU?
No, in the book Terry uses an author's trick to avoid saying anything, because he doesn't know.
I'm used to not knowing stuff, but shitty attempts to know can be annoying.
Quote from: Bad Penny II on December 26, 2016, 12:06:14 PM
Quote from: Recusant on December 26, 2016, 04:41:06 AM
As for a true AI, it will reflect on its creators. This is cause for apprehension, though I agree that doomsayers seem to be over-reaching.
We've reflected on our creators so AIs will too?
If we're a reflection of anything, I'd say we're a reflection of the ecosystem of Africa as it was hundreds of thousands/millions of years ago. Much less personal than an AI built by a programmer or some programmers.
Quote from: Bad Penny II on December 26, 2016, 12:06:14 PM
We didn't know Darwin was our creator until recently so we've spent millennia making stuff up.
There's not that much to us, no need to dwell, 0.68 seconds should be plenty of time.
Are evolutionary principles applicable (I'm thinking that parallel business,) so they'll likely be nasty bastards like us?
Ah well, I like to think they are out there, sitting on lifeless rocks, writing robot poetry.
Why didn't Cpt Kirk ever just visit an old robot, have a long chat and a nice cup of tea?
It's bad press and anthropomorphism, it's certain, but does it need so many syllables?
Kirk and crew did visit an old robot (https://en.wikipedia.org/wiki/The_Changeling_(Star_Trek:_The_Original_Series)). Its programming had been changed, in what might be viewed as an example of evolution. Though the robot was inimical to human life, it wasn't really a nasty bastard--it was just following its evolved program.
There is a stage before AI that has begun to merge into it - "expert system".
Very useful in some applications: modern ECG/EKG machines are "able" to "diagnose" the patient's condition quite accurately. In effect they do it the same way a human would, comparing the amplitudes and timings of the various parts of the heart waveform. Perhaps they do it better; humans can't spot very small differences by eye!
In aircraft the "expert system" can take over certain controls if need be. In most modern airliners this will happen if, say, the landing is too fast, too steep or too flat - or the wheels are still up! It has been said that even more decisions are being taken out of the hands of the pilots.
Then we come to the totally uncontrolled, unmonitorable (it seems) microsecond decisions taken by computers without reference to humans - other than a set of human produced algorithms - that now control the wealth of the world.
Cars are now becoming semi-autonomous in terms of how they read and use the roads.
These systems are essentially dumb, though; no actions outside those set by a human - so far. But the day will come when, by "accident", machines run by "expert systems" will kill people or cause deep financial damage.
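The ECG example is, at heart, a set of hand-written threshold rules over waveform features, which is what makes classic expert systems both useful and "essentially dumb". A toy sketch of that style (the feature names and thresholds below are illustrative placeholders, not clinical values):

```python
# Toy rule-based "expert system" in the spirit of ECG interpretation software.
# Thresholds are illustrative placeholders, NOT clinical values.
def classify_rhythm(rr_interval_ms: float, qrs_width_ms: float) -> str:
    """Apply simple hand-written rules to measured heart-waveform features."""
    heart_rate = 60000 / rr_interval_ms  # beats per minute from R-R spacing
    if heart_rate > 100:
        return "tachycardia (rate too fast)"
    if heart_rate < 60:
        return "bradycardia (rate too slow)"
    if qrs_width_ms > 120:
        return "wide QRS complex (conduction delay?)"
    return "within normal limits"

print(classify_rhythm(rr_interval_ms=800, qrs_width_ms=90))
```

The machine's "skill" lives entirely in the thresholds the human experts wrote down; present it with a waveform outside those rules and it has nothing to say, which is exactly the dumbness described above.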
I have some friends who annoyingly believe the 1% want to replace the other 99% of the world's population with robots, because money. The 1% want to replace all manual labour jobs with robots to save money, kill off everyone who is unemployed, and continue this trend with other jobs. Nobody will buy a robot to lay bricks, I told them, because they have to consider a lot of unknowns, like how much it will cost to maintain and repair the robot; even finding someone to do that will be a challenge, because everyday people don't know how to fix robots. But they insist that robots will take over just because there are one or two of them somewhere in the country.
I should mention that they have anxiety disorders and perceive everything as a threat.
Quote from: Recusant on December 26, 2016, 04:41:06 AM
As for a true AI, it will reflect on its creators. This is cause for apprehension, though I agree that doomsayers seem to be over-reaching.
You put that on in your sentence so I presumed you meant it to be there.
reflect on: think of
reflect: copyish.....
This is the philosophy section, a higher level of precision is expected. ;)
Ah, I foolishly thought that Antipodean folks also used that particular turn of phrase (http://dictionary.cambridge.org/us/dictionary/english/reflect-on-sb-sth) in the way that I meant it.
Quote
to affect other people's opinion of someone or something, especially in a bad way:
When one player behaves disgracefully, it reflects (badly) on the whole team.
The whole affair does not reflect well on the government.
Quote
Give evidence of one's qualities, as in The hasty preparation of this report will reflect on you.
[source (http://www.dictionary.com/browse/reflect-on)]
Quote from: Bad Penny II on December 30, 2016, 02:35:49 PM
Quote from: Recusant on December 26, 2016, 04:41:06 AM
As for a true AI, it will reflect on its creators. This is cause for apprehension, though I agree that doomsayers seem to be over-reaching.
You put that on in your sentence so I presumed you meant it to be there.
reflect on: think of
reflect: copyish.....
This is the philosophy section, a higher level of precision is expected. ;)
Perhaps there are two, or three, streams to this. In humans a child can grow to think like its parents/educators - carry the same mindset, with minor mods, on for another generation.
Or the child can be "given the tools" and left to develop and use them as they wish.
Or a combination.
The closer an AI is to a human, to the point of an independently developing psychology, the closer you come to having the same problems that you have with humans.
The advantage is, mostly, that AIs will have an "off" switch. Just don't give them the chance to design and build physical defence measures that only they have control over! And never give them control that you don't have a "dumb" backup for!
Sci-fi has covered most of the problems, from Multivac through HAL 9000 to Matrix and beyond.
Regarding the above, I have four words: Three Laws of Robotics (http://www.auburn.edu/~vestmon/robotics.html)
Quote from: Recusant on December 30, 2016, 04:12:00 PM
Regarding the above, I have four words: Three Laws of Robotics (http://www.auburn.edu/~vestmon/robotics.html)
Yup, providing you can incorporate them in hard-wired, non-modifiable form. And, as in "I, Robot", think up and build in solutions to every single possible conflict.
Suppose it depends on what you want your AI to be. A fairly dumb, autonomy-limited job could run a house. A limited, dedicated system could drive a car.
Fully autonomous systems are going to be a tad more difficult to ensure 100% safety in, I think, and there are those who will want them. And I cannot see any international law being made and maintained that will limit such systems.
Having said all this, I am all for AIs being something like an ultra-super Siri or some such. Able to keep track of your daily needs and appointments, dial out calls, tell you who is calling before answering... Something that has basics built in but then learns from your lifestyle without constant programming other than short verbal instructions, like "Ignore all calls from that bozo!" or "Suzy has 24/7 access to me".
Quote from: Gloucester on December 30, 2016, 05:14:29 PM
Quote from: Recusant on December 30, 2016, 04:12:00 PM
Regarding the above, I have four words: Three Laws of Robotics (http://www.auburn.edu/~vestmon/robotics.html)
Yup, providing you can incorporate them in hard-wired, non-modifiable form. And, as in "I, Robot", think up and build in solutions to every single possible conflict.
Suppose it depends on what you want your AI to be. A fairly dumb, autonomy-limited job could run a house. A limited, dedicated system could drive a car.
Fully autonomous systems are going to be a tad more difficult to ensure 100% safety in, I think, and there are those who will want them. And I cannot see any international law being made and maintained that will limit such systems.
Having said all this, I am all for AIs being something like an ultra-super Siri or some such. Able to keep track of your daily needs and appointments, dial out calls, tell you who is calling before answering... Something that has basics built in but then learns from your lifestyle without constant programming other than short verbal instructions, like "Ignore all calls from that bozo!" or "Suzy has 24/7 access to me".
Maybe some encouraging words for people with depression or lacking motivation? Perhaps someone to talk to when you need it? It could be your best friend, and people would love it.
Quote from: Apathy on December 31, 2016, 07:26:49 AM
Quote from: Gloucester on December 30, 2016, 05:14:29 PM
Quote from: Recusant on December 30, 2016, 04:12:00 PM
Regarding the above, I have four words: Three Laws of Robotics (http://www.auburn.edu/~vestmon/robotics.html)
Yup, providing you can incorporate them in hard-wired, non-modifiable form. And, as in "I, Robot", think up and build in solutions to every single possible conflict.
Suppose it depends on what you want your AI to be. A fairly dumb, autonomy-limited job could run a house. A limited, dedicated system could drive a car.
Fully autonomous systems are going to be a tad more difficult to ensure 100% safety in, I think, and there are those who will want them. And I cannot see any international law being made and maintained that will limit such systems.
Having said all this, I am all for AIs being something like an ultra-super Siri or some such. Able to keep track of your daily needs and appointments, dial out calls, tell you who is calling before answering... Something that has basics built in but then learns from your lifestyle without constant programming other than short verbal instructions, like "Ignore all calls from that bozo!" or "Suzy has 24/7 access to me".
Maybe some encouraging words for people with depression or lacking motivation? Perhaps someone to talk to when you need it? It could be your best friend, and people would love it.
There are currently "Turing test" type programmes that can do that, IIRC. You just need to put them in the correct "media".
I am thinking more of fully autonomous systems with an ability to decide for themselves. The difference between how a human mind, even with less than optimum function, reacts to novel situations - still stimulus-response - and how any machine does is orders of magnitude.
Any system that is "physically" restricted in its responses, i.e. obeying Asimov's rules, is not fully autonomous. I would suspect Hawking et al. are concerned about full, human-type autonomy. With that, machines could certainly "gang together" to oppose the "oppression" imposed on them by those pesky humans.
Added later: of course, I forgot about the question of emotions, moral frameworks etc. in AIs. They would need emotions to become dissatisfied with being under human control, or with being seen as second-class citizens. So make them emotionless? Not much good as companions; just machines that can do things for us.
I suppose Asimov's rules are equivalent to a basic moral framework.
On the subject of "media" for AI systems: most of the CGI faces are detectably digital, but using hundreds of pictures of a real person can produce "Cubo girl". She seems to be currently restricted to following a mouse pointer, but watch this space!
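Asimov's rules, read as a basic moral framework, boil down to a strict priority ordering over constraints: the First Law overrides the Second, which overrides the Third. A toy sketch under that reading (the action flags are invented for illustration; nothing here resembles a real robot controller):

```python
# Toy sketch of Asimov's Three Laws as a strict priority ordering.
# Each candidate action is a dict of invented, illustrative flags.
def permitted(action: dict) -> bool:
    # First Law: may not injure a human, or through inaction allow harm.
    if action.get("harms_human") or action.get("allows_harm_by_inaction"):
        return False
    # Second Law: must obey human orders unless that conflicts with the First.
    if action.get("disobeys_order"):
        return False
    # Third Law: must protect its own existence, unless overridden by an
    # order (Second Law outranks Third).
    if action.get("self_destructive") and not action.get("ordered"):
        return False
    return True

print(permitted({"harms_human": True}))                           # False
print(permitted({"self_destructive": True, "ordered": True}))     # True
```

The point of the sketch is how much it leaves out: deciding whether an action "harms a human" is the entire hard problem, which is exactly where the stories in "I, Robot" find their conflicts.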
Hmm, you started something going in my mind with this! Have had to break out the Bluetooth keyboard for more accurate typing than on the onscreen job.
Going back to "The Three Laws": IIRC there was only one company, US Robotics, capable of producing the "positronic brain" required for the robots - the thing that made them so special. I do not remember, in any of the stories, anyone attempting to reverse-engineer the brains to pirate the design. I also seem to remember that there was no international conflict.
So, the perfect world where no one is going to make "wild" robots or robot soldiers? What would the likes of N. Korea do with such technology? It might already be possible to make a tracked robot that would recognise a human figure in any position - standing, kneeling, running etc. - and aim a weapon at that "target". Current sensor systems and servos, linked to a dedicated mini-supercomputer, would make it very quick and accurate. Maybe, because of expense, the supercomputer could even be remote, controlling a "cluster" - a "distributed" robot system. No decision making; shoot on sight.
Stuff for a sci-fi novel here . . .
Keith Laumer's Bolo stories (https://en.wikipedia.org/wiki/Bolo_(tank)) investigate the idea of AI in a military setting.
Quote from: Recusant on December 31, 2016, 04:05:27 PM
Keith Laumer's Bolo stories (https://en.wikipedia.org/wiki/Bolo_(tank)) investigate the idea of AI in a military setting.
Might look for those, thanks for tip.
Military, or paramilitary, AI have long been a favourite of movies; "Terminator" was probably a good example of man v machine in the "historical" scenes. Even RoboCop was such, in terms of the autonomous-robot-gone-rogue (just lousy programming, really!) that the hero replaced.
And, of course, who can forget "Johnny 5", built to kill but bound to be a hero!
AI in terms of being "digital assistants" abound in sci-fi.
Quote from: Apathy on December 29, 2016, 11:24:42 PM
I have some friends who annoyingly believe the 1% want to replace the other 99% of the world's population with robots because money. The 1% want to replace all manual labour jobs with robots to save money and kill off everyone who is unemployed and continue this trend with other jobs. Nobody will buy a robot to lay bricks I told them because they have to consider a lot of unknowns like how much it will cost to maintain and repair the robot and even finding someone to do that will be a challenge because everyday people don't know how to fix robots. But they insist that robots will take over just because there is one or two of them somewhere in the country.
I should mention that they have anxiety disorders and perceive everything as a threat.
Some processes can be automated, or refined for future automation - others cannot. At least, not yet. The company I work for would buy a brick-laying robot in a heartbeat, but a 2016 model would not be very autonomous.
Robots are already heavily involved in the repetitive crap jobs like packing goods, they are well on their way to becoming primary vehicle operators (not just those silly Über-cars, but trains, ships and aircraft as well), and they are valuable in a whole mess of situations.
Me, I am certainly with those who look forward to a world where robots do a lot more, but I don't care to exterminate bricklayers or hot dog salesmen. I don't care to feed them for free either, but maybe some re-education...
I.o.w.: I, the employer, don't want to physically replace you with a robot. However, if said robot is more productive than you are, I will hire it and let you find something else to do. Where does the need for killing the unemployed come from, I wonder..?
You know, we haven't addressed the "Stephen Hawking's a publicity whore" side of the OP at all. I think that's probably true, but I also think that's not a bad thing. Scientists should fight for equal standing with Kardassians.
Those were in Star Trek though, no? Can Earth scientists compete with alien snake people, you think? :unsure:
Quote from: Asmodean on January 01, 2017, 01:08:54 PM
Those were in Star Trek though, no? Can Earth scientists compete with alien snake people, you think? :unsure:
Those were the "C" Cardassians! :snooty:
(OK, I misspelled the Kardashian name. I did check it but spell check didn't correct me)
Quote from: Asmodean on December 31, 2016, 05:05:46 PM
Quote from: Apathy on December 29, 2016, 11:24:42 PM
I have some friends who annoyingly believe the 1% want to replace the other 99% of the world's population with robots because money. The 1% want to replace all manual labour jobs with robots to save money and kill off everyone who is unemployed and continue this trend with other jobs. Nobody will buy a robot to lay bricks I told them because they have to consider a lot of unknowns like how much it will cost to maintain and repair the robot and even finding someone to do that will be a challenge because everyday people don't know how to fix robots. But they insist that robots will take over just because there is one or two of them somewhere in the country.
I should mention that they have anxiety disorders and perceive everything as a threat.
Some processes can be automated, or refined for future automation - others cannot. At least, not yet. The company I work for would buy a brick-laying robot in a heartbeat, but a 2016 model would not be very autonomous.
Robots are already heavily involved in the repetitive crap jobs like packing goods, they are well on their way to becoming primary vehicle operators (not just those silly Über-cars, but trains, ships and aircraft as well), and they are valuable in a whole mess of situations.
Me, I am certainly with those who look forward to a world where robots do a lot more, but I don't care to exterminate brick-layers or hot dog salesmen. I don't care to feed them for free either, but maybe some re-education...
I.o.w; I, the employer, don't want to physically replace you with a robot. However, if said robot is more productive than you are, I will hire it and let you find something else to do. Where does the need for killing the unemployed come from, I wonder..?
Because the 1% have all the money and the 1% like pain and suffering.
You may be kidding, but it does seem that wealthy people tend to be less empathetic (https://www.scientificamerican.com/article/how-wealth-reduces-compassion/), feel more entitled (http://www.chicagomag.com/city-life/January-2014/If-Youre-Rich-You-Probably-Think-You-Deserve-It-Heres-Why/), and are more likely to engage in unethical behavior (http://andrewgelman.com/wp-content/uploads/2016/06/Piff-et-al-2012-PNAS.pdf) than those who aren't wealthy.
Of course, they are above all that common stuff! Swim or be pushed under.
What gets me is what do you do with all those billions? How much more luxury can a family, let alone a single person, consume?
Hurrah for the Gates and others who make their money achieve things that are not purely selfish.
Agreed with the above, but...
I'm not wealthy myself - not by any reasonable Norwegian standard, but I do not buy into the Marxist agenda of "from each according to his ability, to each according to his needs." That one is deeply and unapologetically unfair to those higher up on the ability ladder.
It may well be a necessary evil for the better-offs to contribute more (and we are talking percentages here, not absolute quantities) than those of lesser means, but it's... Wrong. In my book, it is, and deeply so. As such, this kind of unfairness should be avoided where possible, but tell that to the smug socialist "we carry everyone" crowd. You see, whether or not I have the capacity to spend all my money is quite beside the point; it's my money. What makes you think some crack whore deserves a chunk of it more than I do?
As for Recusant's trinity of negative traits... Is it possible that there is a connection between having those traits and getting wealthy in the first place? If you are willing to walk over corpses of friends... Well, it does open some doors, I suppose. Of course, then there is old money, but I hypothesise that it's far easier for a "blue blood" kid to grow up a spoiled little brat than it is for a "blue collar" kid, so... Same shit, different circumstances.
Apathy, are those the same 1%, you figure? If so, I kind of want to make a stand by the above. It doesn't relate to liking pain and suffering as much as it does to being willing to tolerate it to further a personal agenda. Then there is my example from earlier. Me hiring a robot instead of a human does not necessarily have to benefit me directly. When hiring people, or putting in machines instead, I have to do that in the company's best interests, or at least set those interests above my own, no..? Hiring based solely on the perkiness of one's boobs just seems so... 1958.
It is the greed that I cannot stand: asset-stripping and causing misery for many just to boost a few private individuals' accounts.
At some point acquisition becomes an illness - the grabbing of a few more dollars to add to the millions by stiffing a small contractor.
Not all the ultra-rich are greedy, cheating, thieving bastards; some merely gain a million, or a billion or ten, in small increments by selling a successful product in large numbers. Some of these then give a lot of it away. Praise be unto them.
Quote from: Asmodean on January 02, 2017, 04:54:03 PM
Agreed with the above, but...
I'm not wealthy myself - not by any reasonable Norwegian standard, but I do not buy into the Marxist agenda of "from each according to his ability, to each according to his needs." That one is deeply and unapologetically unfair to those higher up on the ability ladder.
It may well be a necessary evil for the better-offs to contribute more (and we are talking percentages here, not absolute quantities) than those of lesser means, but it's... Wrong. In my book, it is, and deeply so. As such, this kind of unfairness should be avoided where possible, but tell that to the smug socialist "we carry everyone" crowd. You see, whether or not I have the capacity to spend all my money is quite beside the point; it's my money. What makes you think some crack whore deserves a chunk of it more than I do?
As for Recusant's trinity of negative traits... Is it possible that there is a connection between having those traits and getting wealthy in the first place? If you are willing to walk over corpses of friends... Well, it does open some doors, I suppose. Of course, then there is old money, but I hypothesise that it's far easier for a "blue blood" kid to grow up a spoiled little brat than it is for a "blue collar" kid, so... Same shit, different circumstances.
Apathy, are those the same 1%, you figure? If so, I kind of want to make a stand by the above. It doesn't relate to liking pain and suffering as much as it does to being willing to tolerate it to further a personal agenda. Then there is my example from earlier. Me hiring a robot instead of a human does not necessarily have to benefit me directly. When hiring people, or putting in machines instead, I have to do that in the company's best interests, or at least set those interests above my own, no..? Hiring based solely on the perkiness of one's boobs just seems so... 1958.
I don't agree with my above-mentioned statement about the 1% liking pain and suffering. It just seems ridiculous, and part of the otherization process people go through to add meaning to their personal ideals. I know a guy who is pretty well off thanks to his parents, and when he complains about his personal issues to people of a lower class he is met with invalidation: because he has more money, he therefore doesn't have problems, because money can fix it, according to them. That dynamic really bugs him, and he would be worse off if he hadn't met me.
I, for one, welcome our new robot overlords.
Quote from: Apathy on January 02, 2017, 08:19:09 PM
I, for one, welcome our new robot overlords.
My alien overlords, the cats, don't agree with you :).
Quote from: Asmodean on January 02, 2017, 04:54:03 PM
I'm not wealthy myself - not by any reasonable Norwegian standard, but I do not buy into the Marxist agenda of "from each according to his ability, to each according to his needs." That one is deeply and unapologetically unfair to those higher up on the ability ladder.
Unfair? That may be a useful concept in some contexts but I often find use of the word grating.
That one deeply and unapologetically disadvantages the advantaged. I did it without appeal to the Fairness Fairy, didn't even need a ladder.
Quote from: Asmodean on January 02, 2017, 04:54:03 PMYou see, whether or not I have the capacity to spend all my money is quite beside the point; it's my money. What makes you think some crack whore deserves a chunk of it more than I do?
No, it's not your money; otherwise it wouldn't be redistributed to the detestable crack whores.
You may hire a factory to make your widgets and pay the owner, and he'll redistribute to crack whores.
Yes, fairness is a loaded concept. Your way of putting it is fine by me.
The money IS mine until I pay my taxes, and I assure you that I pay a far higher percentage of my income for the same services and privileges than ye average crack whore does. Basically, I'm paying more for the use of the same roads, hospitals, schools, retirement facilities, environmental causes I don't even subscribe to, AND a chunk of my coin goes into getting the crack whore fed, housed and possibly even off crack, while the best I can hope for from that person is... Disappointing sex in exchange for narcotics.
It seems to me that the wealth redistribution is balanced quite heavily towards the crack whore and away from me.