TCju
2021-08-06
Good corporate practices
Why Apple's Plan To Scan iPhones For Child Sexual Abuse Content Is Worrying Experts
{"i18n":{"language":"zh_CN"},"detailType":1,"isChannel":false,"data":{"magic":2,"id":893178930,"tweetId":"893178930","gmtCreate":1628251193020,"gmtModify":1633752252809,"author":{"id":3578361458056498,"idStr":"3578361458056498","authorId":3578361458056498,"authorIdStr":"3578361458056498","name":"TCju","avatar":"https://static.laohu8.com/default-avatar.jpg","vip":1,"userType":1,"introduction":"","boolIsFan":false,"boolIsHead":false,"crmLevel":3,"crmLevelSwitch":0,"individualDisplayBadges":[],"fanSize":0,"starInvestorFlag":false},"themes":[],"images":[],"coverImages":[],"extraTitle":"","html":"<html><head></head><body><p>Good corporate practices </p></body></html>","htmlText":"<html><head></head><body><p>Good corporate practices </p></body></html>","text":"Good corporate practices","highlighted":1,"essential":1,"paper":1,"likeSize":0,"commentSize":0,"repostSize":0,"favoriteSize":0,"link":"https://laohu8.com/post/893178930","repostId":1167569709,"repostType":2,"repost":{"id":"1167569709","kind":"news","weMediaInfo":{"introduction":"Stock Market Quotes, Business News, Financial News, Trading Ideas, and Stock Research by Professionals","home_visible":0,"media_name":"Benzinga","id":"1052270027","head_image":"https://static.tigerbbs.com/d08bf7808052c0ca9deb4e944cae32aa"},"pubTimestamp":1628233505,"share":"https://www.laohu8.com/m/news/1167569709?lang=&edition=full","pubTime":"2021-08-06 15:05","market":"us","language":"en","title":"Why Apple's Plan To Scan iPhones For Child Sexual Abuse Content Is Worrying Experts","url":"https://stock-news.laohu8.com/highlight/detail?id=1167569709","media":"Benzinga","summary":"Apple has introduced new features to detect child sexual abuse content on iPhones in the U.S., but t","content":"<p><b><a href=\"https://laohu8.com/S/AAPL\">Apple</a></b> has introduced new features to detect child sexual abuse content on iPhones in the U.S., but the move is worrying experts.</p>\n<p><b>What Happened</b>: Apple said Thursday that a new system will detect images of child exploitation called Child Sexual Abuse Material (CSAM) and match the image against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC) and other child safety organizations.</p>\n<p>The matching process is done on the user’s iPhone before the image is uploaded to the iCloud. If a matching image is found, it will be manually reviewed.</p>\n<p>On confirmation of child pornography, the user’s iCloud account will be disabled and the incident will be reported to law enforcement.</p>\n<p>Apple said that the new features will come later this year in updates to its operating software for iPhones, Apple Watches and Macs.</p>\n<p><b>Why It Matters:</b> <a href=\"https://laohu8.com/S/AAPL\">Apple</a> has been under pressure from law enforcement around the world to weaken its encryption so that it would help in the investigation of terrorism or child exploitation. The launch of the new features will help to alleviate some of those concerns.</p>\n<p>However, security experts are worried that the technology could be eventually be expanded to scan phones for other prohibited content. 
Apple said the new features will come later this year in updates to its operating software for iPhones, Apple Watches and Macs.

Why It Matters: Apple has been under pressure from law enforcement around the world to weaken its encryption to help investigations of terrorism or child exploitation. The launch of the new features will help alleviate some of those concerns.

However, security experts worry that the technology could eventually be expanded to scan phones for other prohibited content. It could also be used by authoritarian governments to spy on dissidents and protesters.

Matthew Green, a cryptography researcher at Johns Hopkins University, warned that the new features will "break the dam" and governments will demand data from everyone.

"[The] problem is that encryption is a powerful tool that provides privacy, and you can't really have strong privacy while also surveilling every image anyone sends," Green said.

"Whoever controls this list can search for whatever content they want on your phone, and you don't really have any way to know what's on that list because it's invisible to you (and just a bunch of opaque numbers, even if you hack into your phone to get the list.)"
— Matthew Green (@matthew_d_green) August 5, 2021

Price Action: Apple shares closed less than 0.1% higher in Thursday's regular trading session at $147.06, but declined 0.11% in the after-hours session to $146.90.