
Project aims to find child exploitation

Local researchers aim to stop child sexual abuse images from reaching the Internet as online youth exploitation reaches alarming levels

On the heels of reports indicating that the distribution of online child sexual abuse material (CSAM) is reaching alarming proportions, with the number of cases in Canada continuing to double year over year according to the RCMP, international eyes are focused on a group of Canadian researchers working to stop it.

Student researchers from the University of Manitoba are working with Two Hat of Kelowna to develop cutting-edge artificial intelligence software that will be the first in the world to accurately identify previously undetected child sexual abuse images and prevent their distribution.

The student placements are part of a five-year program coordinated by Mitacs, a national government-funded agency working to bridge the gap between academic research and business.

"Of all the issues we're solving to keep the Internet safe, this is probably the most important," said Two Hat CEO Chris Priebe, noting that stopping CSAM is a challenge every child exploitation unit faces.

"Everyone would like to solve it but it's very challenging to tackle because it's an extremely complex problem and it's in the darkest corner of the Internet," he said.

"Whereas existing software tools search the Internet for known images previously reported to authorities as CSAM, Two Hat's product will accurately scan for images that exploit children as they are uploaded, with the ultimate goal of stopping them from ever being posted, which is why global law enforcement and security agencies are watching closely," said Two Hat head of product development Brad Leitch.
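To make that distinction concrete, the following is a minimal Python sketch of the two approaches, assuming a hypothetical fingerprint database and classifier callback; it illustrates the general technique only and does not reflect Two Hat's actual implementation.

```python
# Illustrative sketch only. Existing tools compare uploads against
# fingerprints of images already reported to authorities; the approach
# described in the article instead scores images never seen before.
# All names and thresholds below are hypothetical.
import hashlib

KNOWN_FINGERPRINTS = set()  # would come from an authority-maintained database


def fingerprint(image_bytes: bytes) -> str:
    # Real systems use robust perceptual hashes that survive resizing and
    # re-encoding; a cryptographic hash stands in here for simplicity.
    return hashlib.sha256(image_bytes).hexdigest()


def screen_upload(image_bytes: bytes, classifier_score) -> str:
    # Step 1: known material, matched by fingerprint.
    if fingerprint(image_bytes) in KNOWN_FINGERPRINTS:
        return "block: matches previously reported material"
    # Step 2: previously undetected material, scored by a trained model.
    if classifier_score(image_bytes) > 0.9:  # hypothetical threshold
        return "block and flag for human review"
    return "allow"
```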

Current research indicates as many as 22 per cent of all teenage girls are sending inappropriate photos of themselves. In addition, statistics compiled by the RCMP show that child sexual abuse cases in Canada doubled in 2015 and again in 2016, highlighting the critical need for a tool to help tackle the issue.

"This is a rampant global problem," said Sergeant Arnold Guerin of the RCMP. "The ability to successfully detect and categorize newly distributed child sexual abuse materials will be a game-changer in our fight against the online victimization of children."

The first phase of Two Hat香蕉视频直播檚 new product development involves students from the University of Manitoba who are at the leading edge of computer vision, deep learning and convolutional neural networks, the three main technologies being applied. Their work is particularly challenging because it is a criminal offence to view CSAM, meaning they are training computers to recognize images they themselves will never see.
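For readers unfamiliar with the terminology, the sketch below shows what a small convolutional neural network for image classification looks like, assuming PyTorch; the architecture, input size and class labels are invented for illustration and are not the researchers' model.

```python
# A minimal convolutional image classifier, for illustration only.
import torch
import torch.nn as nn


class SmallImageClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        # Convolutional layers learn visual features directly from pixels.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # A linear head maps those features to two classes: flagged / not flagged.
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 56 * 56, 2),
        )

    def forward(self, x):
        return self.head(self.features(x))


model = SmallImageClassifier()
# Random 224x224 RGB input stands in for an image; output is class probabilities.
scores = torch.softmax(model(torch.randn(1, 3, 224, 224)), dim=1)
print(scores)
```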

"It would be impossible to do this without the support of Mitacs," said Priebe. "By connecting our business with student interns, we're tapping into researchers at the top of their respective fields who are not afraid to tackle the impossible."

Over the next five years, the Mitacs interns will be working to develop software that identifies sexually abusive images of children with a high degree of accuracy. The end goal is a software tool that can proactively be applied to stop people from uploading CSAM and can also be used by law enforcement agents to quickly identify and prioritize new cases. In the case of teenagers, for example, it will be possible for the software to send a warning that the image they are about to upload from their phone, tablet or computer is inappropriate or illegal.
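A hedged sketch of that warning flow follows, assuming a score function that returns the model's confidence that an image is abusive; the thresholds and messages are hypothetical.

```python
# Hypothetical upload-time check: warn or block before an image leaves
# the device, based on a model confidence score in the range [0, 1].
def check_before_upload(image_bytes: bytes, score) -> str:
    confidence = score(image_bytes)
    if confidence > 0.95:
        return "blocked: this image appears illegal to share"
    if confidence > 0.7:
        return "warning: this image may be inappropriate and sharing it can have serious consequences"
    return "ok to upload"
```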

"Studies have shown that if you can remind adolescents about the consequences of their actions, there's a high likelihood they won't do it," said Priebe.

The second phase of the multi-year partnership with Mitacs will launch later this year and will involve researchers from Simon Fraser University and Laval University working to identify child sexual offenders on the Internet and prevent them from grooming child victims. Priebe emphasized that solving the problem of CSAM and child victimization on the Internet is a collaborative effort.

"We know that as soon as we can successfully place a CSAM detection system onto a network, people will stop using it for that purpose and we will have won that corner of the Internet," he said.

 



About the Author: Black Press Media Staff
