Yu Chen, Binghamton University, State University of New York
There are many uses for digital systems that are not centrally controlled and that allow large numbers of people to participate securely, even if they don’t all know and trust each other.
Artificial intelligence looks like a political campaign manager’s dream because it could tune its persuasion efforts to millions of people individually – but it could be a nightmare for democracy.
The artificial intelligence boom means a multi-trillion-dollar industry is coming into existence before our eyes. With great opportunity come great risks, as two important new Australian reports show.
I study artificial general intelligence, and I believe the ongoing fearmongering is at least partially attributable to large AI developers’ financial interests.
Figuring out how to regulate AI is a difficult challenge, and that’s even before tackling the problem of the small number of big companies that control the technology.
An expert explains the various concerns that were holding up FDA approval – from potentially harmful side effects to protecting the privacy of users’ brain-wave data.
Generative AI tools, those astonishingly powerful language and image generators taking the world by storm, come at a price: a big carbon footprint. But not all AIs are equally dirty.
Metaphorical black boxes shield the inner workings of AIs, protecting software developers’ intellectual property. They also make it hard to understand how the AIs work – and why things go wrong.
As the drone market continues to expand, a set of rules or standards that can help determine how they are used in warfare is needed, writes a former US diplomat.
Over the years Australia has been quick to point the finger at China – most recently in relation to DJI drones. Instead, we should look closely at our own tech security policies.
Antonio Pele, Pontifícia Universidade Católica do Rio de Janeiro (PUC-Rio)
Setting up AI-free ‘sanctuaries’ could allow us to reap the technology’s benefits while offering vital safeguards to our cognitive capacities and privacy.