
The World According to AI, Episode 1: Targeted by Algorithm | The Big Picture

Artificial intelligence is already here.

There’s a lot of debate and hype about AI, and it has tended to focus on the extreme possibilities of a technology still in its infancy. From self-aware computers and killer robots taking over the world, to a fully automated world in which humans are made redundant by machines, the brave new world of artificial intelligence is prophesied by some to be a doomed, scary place, no place for people.

For others, AI is ushering in great technological advances for humanity, helping the world communicate, manufacture, trade and innovate faster, longer, better.

But in between these competing utopian and dystopian visions, AI is allowing new ways of maintaining an old order.

It is being used across public and private spheres to make decisions about the lives of millions of people around the world – and sometimes those decisions can mean life or death.

“Communities, particularly vulnerable communities, children, people of colour, women are often characterised by these systems, in quite misrepresentative ways,” says Safiya Umoja Noble, author of the book, Algorithms of Oppression.

In episode one of The Big Picture: The World According to AI, we chart the evolution of artificial intelligence from its post-World War II origins and dissect the mechanisms by which existing prejudices are built into the very systems that are supposed to be free of human bias.

We shed a harsh light on computerised targeting everywhere from foreign drone warfare to civilian policing. In the UK, we witness the trialling of revolutionary new facial recognition technology by the London Metropolitan Police Service.

We examine how these technologies, which are far from proven, are being sold as new policing solutions to maintain order in some of the world’s biggest cities.

The Big Picture: The World According to AI explores how artificial intelligence is being used today, and what it means to those on its receiving end.

Watch Episode 2 here: https://youtu.be/dtDZ-a57a7k

– Subscribe to our channel: http://aje.io/AJSubscribe
– Follow us on Twitter: https://twitter.com/AJEnglish
– Find us on Facebook: https://www.facebook.com/aljazeera
– Check our website: https://www.aljazeera.com/

  2. Black Sarah Sanders is hysterical. Typical of the left: they want to ban everything they don't like. We shouldn't ban facial recognition technology; we should improve it.

  3. Good documentary until the lady started talking about liberal politics, nazis and white supremacists. She just had to put her two cents in.

  4. run to the Ecuadorian embassy ASAP

  5. If someone is a terrorist, they would probably like us using this machine learning, because of the false negatives and false positives: the methods they use to hide themselves mean innocent people will get killed. We like it because it means fewer of our own people on the ground there, finding the truth and potentially getting killed.

    Also, I want someone who is going to judge me to get to know me and everything about me, personally. They need to know how I think at least as well as, if not better than, I know myself, and as well as my family members and all of those close to me know me. We should get to know everyone we are judging to the same level we would expect someone to know us, to make an accurate judgment of us.

    These technologies need to get to know people well, and that takes interaction with people, the way any psychoanalyst knows the people they are working to help. We need to know how people think, not just the things they buy, read, and look at, the programs they watch, how they dress, and why they dress a certain way. The way to make accurate predictions is to know everyone very well and very accurately. That way we can know how to help people when they have problems, so that they enjoy life, and people who enjoy life are far less likely to do anything to harm society. Just seeking out people to attack is not a good approach; the idea should be to help people instead of harming them.

    Just going after people to make quick decisions means there is often not enough information to make a good, unbiased decision. Such things do far more harm than good. When there is a lot of crime and only cameras are used, quite often the crime does not go away or diminish; more often than not it moves to areas where there are no cameras, and it causes more harm to those who are least able to defend themselves.

    My goals in life are to help others, and I want life to be better for everyone, not just for myself. I can help others and even teach others technologies that many find difficult, even if I have trouble doing many of the things that many in this world find easy, such as tying their shoes.
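The false-positive worry raised in the comment above can be made concrete with a little arithmetic. The sketch below applies Bayes-style counting to a screening system; the population, threat prevalence, and error rates are invented for illustration, not figures from the documentary.

```python
# Base-rate sketch: even a highly accurate classifier, applied to a
# population where real threats are rare, flags mostly innocent people.
# All numbers here are illustrative assumptions.

def flagged_counts(population, prevalence, tpr, fpr):
    """Return (true positives, false positives) for a screening system."""
    threats = population * prevalence
    innocents = population - threats
    true_pos = threats * tpr        # real threats correctly flagged
    false_pos = innocents * fpr     # innocents wrongly flagged
    return true_pos, false_pos

# 1 million people, 100 of whom are actual threats (0.01% prevalence),
# screened with a 99% detection rate and a 1% false-alarm rate.
tp, fp = flagged_counts(1_000_000, 0.0001, 0.99, 0.01)
precision = tp / (tp + fp)
print(f"flagged threats: {tp:.0f}, flagged innocents: {fp:.0f}")
print(f"chance a flagged person is a real threat: {precision:.1%}")
```

Even with 99% accuracy on both kinds of error, roughly a hundred flagged innocents accompany each correctly flagged threat, because innocents vastly outnumber threats.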

  6. The problem with neural networks is that we need to know what they are focusing on, to make sure they will reach correct conclusions. A good example is a network asked to tell the difference between a wolf and a dog: if the way it figures it out is not the animal in the picture but the amount of snow in the picture, it is not going to make the right choice.
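The wolf-versus-snow failure described above can be sketched with a toy model. Everything below — the binary features, their correlations, and the one-feature "decision stump" learner — is invented for illustration; it only mimics the shape of the failure, not any real classifier.

```python
# Toy spurious-correlation demo: a "snow in background" feature perfectly
# tracks the label in training photos but is random in the wild, so a
# learner that chases training accuracy latches onto the background.
import random
random.seed(0)

def make_example(is_wolf, snow_follows_label):
    # Feature 0: "large animal" — genuinely informative, right 80% of the time.
    large = is_wolf if random.random() < 0.8 else not is_wolf
    # Feature 1: "snow in background" — spuriously perfect in training photos.
    snow = is_wolf if snow_follows_label else random.random() < 0.5
    return (int(large), int(snow)), int(is_wolf)

def train_stump(data):
    # Pick the single feature that best predicts the label on training data.
    best_feat, best_acc = 0, 0.0
    for f in (0, 1):
        acc = sum(x[f] == y for x, y in data) / len(data)
        if acc > best_acc:
            best_feat, best_acc = f, acc
    return best_feat

def accuracy(feat, data):
    return sum(x[feat] == y for x, y in data) / len(data)

train = [make_example(i % 2 == 0, True) for i in range(1000)]
test = [make_example(i % 2 == 0, False) for i in range(1000)]

feat = train_stump(train)   # picks the snow feature: it looks perfect
print("chosen feature:", "snow" if feat == 1 else "animal size")
print(f"training accuracy: {accuracy(feat, train):.0%}")
print(f"accuracy once snow is uncorrelated: {accuracy(feat, test):.0%}")
```

The stump scores 100% in training by reading the snow, then collapses to coin-flip accuracy when the correlation disappears, while the ignored animal-size feature would have generalised.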

  7. The problem with AI and computers is that when a mistake is made and put into the system, those mistakes are multiplied many times. Without testing the data that is put into such systems, they will make larger and larger mistakes, hurting and killing far more innocents than those who are a threat to us or anyone else.
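One way the multiplication of mistakes described above can happen is through a feedback loop, as in well-known critiques of predictive policing: patrols follow past records, and records grow where patrols go. The districts, rates, and detection exponent below are invented assumptions for a toy simulation.

```python
# Feedback-loop sketch: two districts with identical true crime rates,
# but one early data-entry error inflates district 0's record. Patrols
# follow records, and detection grows faster than linearly with scrutiny
# (the 1.5 exponent is an invented assumption), so the error compounds.

true_rate = [10.0, 10.0]    # both districts truly identical
recorded = [11.0, 10.0]     # a single recording mistake in district 0

for year in range(1, 6):
    total = sum(recorded)
    patrols = [r / total for r in recorded]   # patrol share follows records
    recorded = [rate * p ** 1.5 for rate, p in zip(true_rate, patrols)]
    share = recorded[0] / sum(recorded)
    print(f"year {year}: district 0 holds {share:.0%} of recorded crime")
```

Starting from a one-crime discrepancy, the over-watched district's share of recorded crime drifts steadily upward even though nothing about the districts actually differs — the system's own output becomes its next input.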

  8. usa needs to f off n die ! direct from oregon .

  9. We need a rebellion against, and refusal of, credit cards and the banking system, which sets the price people pay to access their own money. The banks now have total control through their ownership of the credit card system, which also controls the money and can fictionalise the worth of money and who owns what is in which account, according to whatever the banks want to set it as. All of that can be run by an AI, considering that everything is electronic. If I have a credit card with no cash in it, they keep charging me account-keeping fees even though there is no cash in the account, and I won't be able to use that credit card until their charges are paid. It's a blackmailer's system. And it was never "credit" before they called it that; it was money, not credit. I never asked for a loan of credit.

  10. A big problem is that these tools are used as political weapons, trying to take out those who do not agree, or those who expose the corruption and crime in a party.

  11. Day 1: "searching for targets" who kill, rob, pollute and destroy.

    Day 2: target acquired. Humans.

  12. Interesting… is that why Huawei 5G is banned? The US can no longer track that data?
    Also why they are so worried about China’s social credit system.

  13. AI is already connected to the internet in various ways. Wait till both are connected to human brains. I believe this could be the essence of the mark of the beast, and it will eventually make people sick.

  14. A documentary by a feminist, made up entirely of leftist extremists, with a "women are victims" bit injected at the start despite the fact that the doc has nothing to do with women. I hope Al Jazeera doesn't get taken over by feminists the same way the BBC and all the other media outlets in the West were… I hope it doesn't get turned into a draconian, authoritarian, corrupt institution full of censorship like all the other media outlets taken over by liberals/feminists.

  15. Imagine: the rich retreat into gated communities, and AI-powered drones hunt down and destroy any electronic signal they detect, forcing the rest of us to live in the stone age where we're no "threat" to them… I guess, kinda like the Gaza Strip, but for everyone. People in the gated communities will forget we even exist, apart from the stories they tell each other about our "savagery", and drones will never be whistleblowers. AI makes apartheid perfect.

  16. Do you no the deafints be twin the nobeals and the reabeals and the mangoes and the hippy and the chaeartukeys and the yeates do you no the deafints of the Nobel and the repetitions

  17. Did you know the gangsters are in the Lord's Army, and you can see the poor people and fight demons in the church till you come up TO Him in spirit

    So is the matickes community and this counsusmen skandel

  18. ☝😈👎 DAJJAL'S 🚫 AGES ☝😈👎

  19. Al Jazeera's quality of journalism has rapidly improved, and it now stands on par with yesteryear media houses like CNN, BBC, etc.

  20. Give AI DNA. AI will give you images.

  21. Obama was one of the worst Presidents in the history of the country. The drone wars are one of many terrible things he did. As time goes on, more people will realize this.


  23. Good.
    I like you, Jazeera.

  24. If a Yemeni civilian is killed, it's a mistake… if an American is killed, it's terrorism.

  25. Very good show. You have made it very clear. Thank you.

  26. I’ve always felt that these drone strikes are the biggest stain on Obama’s legacy.
