I was in the lab early this morning, and people finally started coming in after nine. One of the Indian guys was asking my boss, who is also Indian, for advice about a job interview process he's going through with a major Wall Street trading firm, the name of which everyone would recognize.
Since I sit only a few feet away, it's impossible not to listen, and they don't care if I do. I liked that my boss told him, "And if they ask you when you can start, you say right away."
He asked my boss for some background on how we use certain architectures in our system, so he could answer questions he anticipated receiving in his interviews. My boss went into a long digression about how the system works in various regards, which I found extremely interesting, to the point of making a few notes as he spoke.
Then the subject veered over to LLMs (the core of AI at the moment) and how they work. As I've mentioned elsewhere, one of the big worries people have at ASU is "bias" in the responses. There is a whole team, almost all women, whose job it is to find bias in answers and correct for it. Usually I would tune this part out, as I don't care, but unexpectedly I found myself fascinated when he provided a concrete example.
"ChatGPT is biased against girls," he said. I wanted to hear this. "If you tell it you want to write a Python function, and you tell it that you're a boy, it will give you a simple direct answer. But if you tell it you're a girl, it will assume you need more help and provide a more detailed step-by-step answer, and it will encourage you at each step. You can try it for yourself. So that's a bias you need to correct for."
So the assumption is that girls need more help than boys to write computer code? ChatGPT is not programmed with this in mind. It had to have learned it from the corpus of material on the Internet. I pointed out that this is precisely the point of all those "help girls learn STEM" programs, which explicitly assume that girls need more encouragement.
So it's sexist not to give girls explicit encouragement. It's also sexist to encourage girls, because it assumes they need encouragement. In a way, ChatGPT is doing exactly what it is supposed to do, based on all the programs for getting girls into coding. And yet it's biased---in exactly the way it's supposed to be. Except we don't want that either. Yikes.
The takeaway for now is that there is a "Girl Mode" with ChatGPT. Just tell it you're a girl and it will give you extra help on technical topics.