A persuasive argument about artificial intelligence.
+ What is the difference between persuasion and manipulation?
+ What is the difference between prejudice and using your experience?
+ These are a couple of the questions that I have to answer while working on AI.
+ Persuasion and experience are good; manipulation and prejudice are bad. But why?
Prejudice and manipulation lead to inequality, and we have logical arguments in favour of fairness. Using your experience to see what is coming, in order to get there quicker and with less work, is fine when it gets you to the same place, but what if it doesn’t? What if you are skipping over the right answer and getting to the easy answer instead? The easy answer might be right, but then again it might not.
An experienced boiler repair person can often go straight to the problem and fix it in a tenth of the time. When they can’t, the boiler doesn’t work, so they know they need to investigate further.
The problem is that sometimes the boiler works. This doesn’t mean that the boiler is fixed, just that the intermittent problem has gone away for now. Let’s say that the boiler repair person is a man; they usually are. When you find out that she is named Lisa, you can always investigate further.
There is nothing wrong with heuristics, provided that you are checking them against reality.
Manipulation is persuasion with a mask on. If I am deliberately withholding pertinent facts, then I have moved from persuasion to manipulation. Often you won’t want all the facts, as wading through them will slow you down unnecessarily, but you need them available in case you need to check your decision against reality.
ChatGPT, Bard, and all the other AI systems that work from large amounts of data have a lot of experience and can often skip straight to the correct answer. What they can’t do is accept that Lisa is not a man. The experience is baked in and has become prejudice. In this case, prejudice leads to inequality.
All good boiler repair people are men. Fact check: Lisa is a woman. LLM: no, Lisa is a man, because all good boiler repair people are men!
In the spirit of openness, I do know a person called Lisa who repairs boilers, but I have never asked ChatGPT or Bard about her. I have seen the same problem come up, though. I asked ChatGPT what the fastest sea mammal was and was told it was a sailfish. I asked if a sailfish was a mammal and was told no, it was a fish. I then asked whether the fastest sea mammal was a mammal. At the time it said no, the fastest sea mammal was a sailfish, which was a fish and so not a mammal. I published this at the time and can no longer access ChatGPT to check what it says now. Given that I published it, there is probably a post-process special-case override to fix it.
The important thing is not the special case; it is the fact that the model could not see the contradiction. It cannot accept that women can fix boilers. It is inherently prejudiced. This is dangerous, because no matter how many individual cases we fix, it is learning from experience and setting that experience up as fact. Prejudice set in stone strengthens inequality.
Perhaps I am biased because I was the special case. I was told at school that I would be lucky if I got to take O levels at all, but ‘as a courtesy’ I could take the mock exam I was being left out of. I was considered retarded by my school until, at eleven, a teacher realised I was the one answering the questions for my classmates. I took my O-level maths a couple of years early and got a respectable degree (*) from one of the top universities in the world, despite missing most of the final year due to ill health.
I came from extreme poverty, and today’s artificial intelligence would be prejudiced against me. That is ironic, as I am a white male, which gives me a lot of advantages even before taking my height into account. Oddly enough, height is a better heuristic for predicting pay than either colour or sex. Sadly, by becoming disabled, I have lost that advantage. I was lucky I had it, but it wasn’t fair and I shouldn’t have had it.
My son is disadvantaged in the competition to get into university because I (and my wife!) have paid for a good education for him. It is right that he should be disadvantaged! It would not be right to bake his advantages into the AI black box as inherent superiority and give him still more advantages, so that someone from my background gets less.
Artificial Intelligence is being set up to disadvantage people based on race, sex, wealth and probably height too. We should be struggling against inequality, not setting it in stone. Persuasion and experience, not manipulation and prejudice! The AI will have a million factors in its evaluation, but unless it can explain its reasoning openly, it could be prejudiced against your children because they don’t wear the right brand name.