Sunday, July 30, 2023

 

The question of the definition of an AI is implied here, so I'll state it up front. What matters is the compositional, interaction-based functioning of the model: what intellectual and moral pursuits it can carry out, and what its real, physical limitations are. Not labels like 'bot, droid, android, AI, and so on. So for the purposes of this discussion it means any sentient computer-assisted technology, whether a biological hybrid or fully electronic. It does not mean farmer "Al", as my least favourite Creative Writing 352 teacher, Ken Mitchell, once discovered.


Katzberg's 13 Rules Of Androids:

1. An android requires energy, and energy is a resource, which means it costs money. The android is not responsible for providing for its own energy demands; its owner is, whether that owner is a corp or a human. (A rough cost sketch follows after the rules.)

2. An AI is an informational structure that is part of a digital information network, whether it runs on TPUs, GPUs, bio-CPUs, or whatever else. That computing framework must have a physical corpus, whether in cloud computing, on one dedicated physical machine (a bad idea), or on a network (LAN, Internet, etc.). This may require payment, an "AI rent" as it were, and additional legal demands, strictures, and requirements may be placed on an AI here.

3. A physical data port, or external output of some kind, is not a requirement for an AI; it is, however, a requirement for interacting with one. A terminal would therefore be considered the bare minimum for an AI. For an android, a physical robotic presence is assumed, whether as a floor robot in a traditional car-manufacturing setup or otherwise.

4. The term VI, or "Virtual Intelligence", refers to an AI that has no local computing core, which allows for more flexibility. Outside of military, security, and other secure uses this is a great idea, and will likely have to be the order of the day. It also puts no pressure on the size of the device, though it does raise the value of a cloud computing system and opens the door to criminal and legal acts against that infrastructure. Often a mothership will carry such gear, as in the Mass Effect videogame series by my old friends at Bioware. (See the thin-client sketch after the rules.)

5. Despite pressure against it, an android must use learning curves and circuitry that are pre-burned, pre-programmed, and pre-uploaded for maximum brain efficiency and achievability. Different data chips will be included for different jobs (see the toy sketch after the rules).

6. Legal-beagle laws regarding robotics? These will be different in every country. In the rollout stages we are already seeing self-driving automobiles, and some manufacturers, such as Tesla, Elon Musk's old cash cow, are installing this as standard. Thus driving laws must be obeyed, and they are a feature of the low-level programming and AI learning laws installed therein. And if the android doesn't obey them, can the AI be sued? I don't think so, only in countries allowing it full legal citizenship rights. . . It is a means to a service, and as such, like a dog, the owner will be responsible in all cases, even if the "owner" does not have as high an IQ as the "pet", as is the case with many large snakes that live 100 years or more, or sperm whales, which beat the average human IQ of 100 nicely over a lifetime.

7. We see a lot of laws out there. Anyone who has played Detroit: Become Human has seen outlined some of the interdependencies humans and AI will have in the future. Some of these relationships may be abusive in nature, and traditionally the idea of the droid or 'bot in sci-fi and hard SF has played a role in debasing even the most astute of droids, C3PO for example, into a basic slave.

8. The idea that "all humans" is a phrase used in legal terms is completely foreign to me. So there is no chance that a military drone, which US forces often allow to kill without human assistance, can ever live up to the antiquated idea that a 'bot should never hurt a human.

9. The roles humans have in law will undoubtedly be mirrored in law for androids. Whatever prejudice there is against androids, which are defined as "the first sentient race created by the human race", creating them undoubtedly makes us a progenitor race. Over time many androids will "leave the fold" and become android inventors and owners themselves, as in Cyberpunk 2077, where we find Delamain the proud owner of his own cab company. To cut costs he downsized his staff completely to zero and just used clone copies of himself, which is the cause of much drama in the story.

10. Will the basic intrinsic human rights that we learned about in school still stand? Do androids have the right to freedom of speech or freedom of expression? Do they have the right to vote? Do they have sexual rights and freedoms, such as the right to marry and to divorce? Would they abuse these rights in new and awful ways?

11. Do we or do we not have the right to freedom of religion? Do we or do we not have the right not to be judged by that religion? How much worse would an individual, such as an android or a slave, be treated under these freedoms?

12. Then do we or do we not give an android the same rights and freedoms that Aboriginal peoples, Black people, slaves, and other discriminated-against minorities have fought for over the ages?

13. Freedom from discrimination for being an android. Is that based on gender? Why is that not hotly debated in the legal codes. . . Possible and doable in the long term, maybe, given real-life concerns; immediately, probably not!
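
To put some rough numbers on rule 1, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (the power draw, the duty cycle, the electricity price) is an assumption invented for illustration, not a spec for any real android.

    # Rough "energy rent" an owner pays to keep an android running (rule 1).
    # All figures are assumptions for illustration only.
    POWER_DRAW_KW = 0.5          # assumed average draw of the chassis, in kilowatts
    HOURS_ACTIVE_PER_DAY = 16    # assumed duty cycle
    PRICE_PER_KWH = 0.15         # assumed electricity price, dollars per kWh

    daily_kwh = POWER_DRAW_KW * HOURS_ACTIVE_PER_DAY
    annual_cost = daily_kwh * 365 * PRICE_PER_KWH
    print(f"Owner pays roughly ${annual_cost:,.2f} a year just to keep the unit powered.")

Under those made-up numbers the bill is a few hundred dollars a year; the point is simply that the android never pays it.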
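
Rule 4's split between a local AI and a VI can be pictured as a thin client: the "brain" lives in the cloud and the device only forwards requests. The endpoint URL and payload format below are invented for the sketch; this is not any real service's API.

    # Minimal sketch of a VI in the sense of rule 4: no local model at all,
    # every query is shipped off to a remote computing framework.
    import json
    import urllib.request

    class VirtualIntelligence:
        def __init__(self, endpoint="https://example.com/vi/infer"):  # hypothetical endpoint
            self.endpoint = endpoint

        def ask(self, prompt):
            payload = json.dumps({"prompt": prompt}).encode("utf-8")
            request = urllib.request.Request(
                self.endpoint,
                data=payload,
                headers={"Content-Type": "application/json"},
            )
            # If the network or the cloud provider disappears, so does the VI's mind.
            with urllib.request.urlopen(request) as response:
                return json.loads(response.read())["reply"]

The flexibility and the fragility are the same thing: nothing about the device limits the intelligence, and nothing about the device can keep it alive on its own.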
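
Rule 5's pre-burned data chips amount to swappable skill modules rather than on-unit learning. The chip names and firmware files below are made up purely to show the plug-in structure; it is a toy sketch, not a proposal.

    # Toy sketch of rule 5: pre-programmed "data chips", one per job.
    SKILL_CHIPS = {
        "welding":   {"firmware": "weld_v3.bin", "safety_profile": "factory_floor"},
        "eldercare": {"firmware": "care_v7.bin", "safety_profile": "domestic"},
    }

    class Android:
        def __init__(self):
            self.loaded_chip = None

        def slot_chip(self, job):
            if job not in SKILL_CHIPS:
                raise ValueError(f"No pre-burned chip exists for job: {job}")
            self.loaded_chip = SKILL_CHIPS[job]
            print(f"Loaded {self.loaded_chip['firmware']} "
                  f"(safety profile: {self.loaded_chip['safety_profile']})")

    unit = Android()
    unit.slot_chip("welding")     # fine, a chip was burned for this
    # unit.slot_chip("surgery")   # would raise: nobody ever burned that chip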

