
Articles on Machine learning


The South Korean Go player Lee Sedol after a 2016 match against Google’s artificial-intelligence program AlphaGo. Lee, ranked 9th in the world, lost 4-1. Lee Jin-man/Flickr

No, artificial intelligence won’t steal your children’s jobs – it will make them more creative and productive

The history of human-machine collaboration suggests that AI will evolve into a “cognitive partner” to humankind rather than into all-powerful, all-knowing, labour-replacing robots.
The Global Biodiversity Information Facility contains 682,447 records of human encounters with dandelions. From www.shutterstock.com

AI is learning from our encounters with nature – and that’s a concern

Does big data threaten how humans explore the natural world? We need to protect our impulses to observe, compare, play, discover and love, no matter what technological capabilities are available.
How can computers learn to teach themselves new skills? baza178/Shutterstock.com

Teaching machines to teach themselves

For future machines to be as smart as we are, they’ll need to be able to learn like we do.
All those neurones: if only a machine could really think like a human. MriMan/shutterstock

Why Google wants to think more like you and less like a machine

Computers today are fast and powerful but they still can’t think like a human when it comes to some tasks we find easy. That’s why tech companies are turning to neuroscience for help.
Prometheus statue at Rockefeller Center, Manhattan. The inscription behind it is a paraphrase of Aeschylus that reads: “Prometheus, teacher in every art, brought the fire that hath proved to mortals a means to mighty ends”. Wikimedia

Internet of Things: between panacea and paranoia

How the idea of a hyper-connected society could quickly go from utopia to dystopia and why neither scenario is likely to last.
CEA

Cracking big data with statistical physics

Methods stemming from decades of research on disordered materials are used to describe algorithmic phase transitions and to design new algorithms for machine-learning problems.
