Developments in the field of artificial intelligence (AI) are moving fast. So fast that prominent figures from the science and tech sectors are calling for a six-month pause in developing systems more powerful than OpenAI’s newly launched GPT-4. They did so yesterday in an open letter citing potential risks to society and humanity. Among the 1,100 signatories are numerous academics and tech pioneers, including Steve Wozniak and Elon Musk.
The pause should be used to establish safety measures for dealing with increasingly complex and sophisticated AI. The letter refers to AI systems that can compete with humans in a range of tasks, and its writers question whether non-human systems smarter than humans should be developed at all. In their view, such questions should not be answered by unelected leaders of tech giants or academics, but by politicians.
Earlier this month, OpenAI unveiled the fourth version of its Generative Pre-trained Transformer (GPT) AI program, which surprised users with its broad range of applications. It can, for example, hold human-like conversations, compose songs, and summarize lengthy documents.
Potential misuse of the system
Among other things, the letter addresses potential risks to society from AI systems in the form of economic and political disruptions. Developers are urged to work with policymakers on administrative and regulatory issues.
The letter comes as EU police force Europol on Monday joined a chorus of ethical and legal concerns over advanced AI like ChatGPT, warning that the system could be misused for phishing attempts, disinformation, and cybercrime.