Apple has announced new technology that will automatically scan iPhones and iPads and, if material depicting the sexual abuse of children is found, report it to the authorities. Privacy activists, however, have expressed concerns.
How will the technology work?
Organizations working to protect children maintain a database of known images of child sexual abuse, referred to as CSAM (child sexual abuse material). Software on Apple's devices will automatically check users' photos and flag any that match entries in the CSAM database. The check is triggered when the device uploads the content to the cloud.
Apple has stated that the matching will be done using cryptographic techniques. The flagged material will not be accessible to the company itself but will be reported to the National Center for Missing and Exploited Children, an agency that works with the police.
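The flow Apple describes can be pictured with a short sketch. This is a simplified illustration, not Apple's code: the function names are invented, and a plain SHA-256 fingerprint stands in for the perceptual "NeuralHash" and blinded matching the real system is said to use.

```swift
import Foundation
import CryptoKit

// Illustrative only: returns fingerprints of known CSAM images.
// In Apple's design these ship with the OS in blinded form; this
// sketch simply returns an empty set.
func loadKnownHashes() -> Set<Data> {
    return []
}

let knownHashes = loadKnownHashes()

// Hypothetical hook, invoked when a photo is queued for upload to
// cloud storage; returns true if its fingerprint is in the database.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    // Fingerprint the image. Apple's system uses a perceptual
    // "NeuralHash"; plain SHA-256 stands in for it here.
    let digest = Data(SHA256.hash(data: imageData))
    return knownHashes.contains(digest)
}
```

In the announced design, a match does not trigger an immediate report: it produces an encrypted "safety voucher," and only after a threshold number of matches can the vouchers be decrypted for human review.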
Digital rights groups, however, have expressed concerns about the development. They say introducing this technology amounts to building a flaw into the software, a kind of 'back door' that governments and other groups could exploit to access people's personal data. The Electronic Frontier Foundation has said Apple's move will please government agencies, but that it is bad news for customers who have stayed with Apple's products over the years because of the company's stronger privacy protections.
The background to Apple's new technology
Apple's new image-scanning feature is one of several tools the company plans to introduce in the coming days. In an online post, Apple wrote that with this technology it wants to keep children out of the reach of people who use communication tools to lure them in for harmful purposes.
Apple added that the Messages app and other applications will now use machine learning to warn children and their parents about sexually explicit content. If such an image is detected, the device itself will blur it and warn the child before it is displayed, and parents can also be notified. In addition, Apple's assistant software Siri will help: when users try to search for prohibited content, Siri will intervene and interrupt the search. These features are also being introduced on Apple's computers.
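The Messages warning flow can likewise be sketched in simplified form. The classifier, its threshold, and the function names below are assumptions for illustration; Apple has not published this code.

```swift
import Foundation

// Simplified sketch of the on-device warning flow in Messages. The
// `scoreExplicitness` closure stands in for Apple's unpublished
// on-device model, and the 0.9 threshold is an arbitrary assumption.
func handleIncomingImage(_ imageData: Data,
                         isChildAccount: Bool,
                         scoreExplicitness: (Data) -> Double) {
    // The model runs entirely on the device; the photo is not sent
    // anywhere for this check.
    let score = scoreExplicitness(imageData)
    guard isChildAccount && score > 0.9 else {
        return  // Nothing flagged; display the photo normally.
    }
    // Blur the preview and warn the child before the photo is shown;
    // parents of younger children can also be notified.
    print("This photo may be sensitive. Are you sure you want to view it?")
}
```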
How big a problem is child pornography?
The US Department of Justice describes child pornography as the sexual exploitation of children. Producing obscene images and videos of children, sharing them, viewing them, and storing them are serious crimes in most countries of the world. The US National Center for Missing and Exploited Children reviews around 25 million such images and videos a year, roughly 480,769 a week. The Netherlands hosts the most child pornography content of any country, accounting for 52% of the websites involved.