BKC Affiliate Aviv Ovadya speaks about the need for more robust testing for GPT-4.
"That is the core tension with any new technology. There’s a problem if there’s too many inputs and too much potential for the technology to be used in fraud or for manipulation. The question is, can we create tools that help protect people from that — tools that take in all that text and figure out if a tool is being used for fraud or being used for manipulation? We need to be identifying the leverage points and the opportunities where you can use the technology itself to protect people from it, and giving access to that technology to people who are helping build that out and supporting the implementation of this sort of resilience technology. We need to do that as much as possible before we put it out into the world."
Read and listen in Marketplace.