[B] Discussion: The Interim Storage Problem of "Nuclear Waste" Facing the United States and Japan

3 months 2 weeks ago
Greetings from the secretariat of the New Diplomacy Initiative (ND). On September 27 we will hold the online symposium "The Interim Storage Problem of 'Nuclear Waste' Facing the United States and Japan." After reports from our American guests on the current state of the "nuclear waste" issue and on local opposition movements there, we will discuss the interim storage problems facing both countries together with experts well versed in the Japanese situation, and with people from Mutsu City, Aomori Prefecture, where an interim storage facility has been built, and from Kaminoseki Town, Yamaguchi Prefecture, where a construction plan has surfaced.
Nikkan Berita

[Book Guide] Introducing "Must-Read" Books Published in September = 萩山 拓 (writer)

3 months 2 weeks ago
A selection of intriguing new nonfiction titles (in order of publication; prices exclude tax). ◆ 武塙麻衣子, "Sakaba no Kimi" (Shoshi Kankanbo), published 9/2, ¥1,500. The author, who has been serializing the novel "Seiko Totei Mansion" in Gunzo since the June 2024 issue, writes: "I carry my own blank map of drinking spots in my head, and day by day I fill it in, little by little, with the places I love, the places I want to return to, and the places I want to try." The bar-hopping chronicle that took shape this way, with its tipsy observations, evokes an indescribable exhilaration...
JCJ

[Kagoshima Prefectural Police Scandal Cover-up] Emergency Rally Held in Sapporo Too = Hokkaido Branch

3 months 2 weeks ago
On July 26, the JCJ Hokkaido Branch held an emergency rally in Sapporo, "What the Internal Whistleblowing on the 'Incident Cover-up' Asks of Us," over the string of scandals at the Kagoshima Prefectural Police, welcoming Jun Ogasawara (photo), the Sapporo-based writer to whom the whistleblower's documents had been mailed. Ogasawara revealed that the police first contacted him four days after the arrest of the former chief of the prefectural police's Community Safety Department, proposing by phone that "the documents are important evidence, so we would like to seize them." He refused, citing the protection of his sources, and recalled: "When I asked the young investigator on the phone what he meant by 'seizure,' and why they were demanding the documents without even a warrant, the call abruptly went dead"...
JCJ

School Monitoring Software Sacrifices Student Privacy for Unproven Promises of Safety

3 months 2 weeks ago

Imagine your search terms, keystrokes, private chats and photographs being monitored every time they are sent. Millions of students across the country don’t have to imagine this deep surveillance of their most private communications: it’s a reality that comes with their school districts’ decision to install AI-powered monitoring software such as Gaggle and GoGuardian on students’ school-issued machines and accounts. As we demonstrated with our own Red Flag Machine, however, this software flags and blocks websites for spurious reasons and often disproportionately targets disadvantaged, minority and LGBTQ youth.

The companies making the software claim it’s all done for the sake of student safety: preventing self-harm, suicide, violence, and drug and alcohol abuse. That is a noble goal, and suicide is indeed the second leading cause of death among American youth aged 10 to 14. Yet no comprehensive or independent study has shown an increase in student safety linked to the use of this software. Quite the contrary: a recent comprehensive RAND study suggests that such AI monitoring software may cause more harm than good.

That study also found that how to respond to alerts is left to the discretion of the school districts themselves. Lacking the resources to address mental health in-house, schools often refer these alerts to law enforcement officers who are untrained and ill-equipped to deal with youth mental health crises. When police respond to young people in the middle of such episodes, the encounters can end in disaster. So why are schools still using the software, when a congressional investigation found a need for “federal action to protect students’ civil rights, safety, and privacy”? Why are they trading away their students’ privacy for a dubious-at-best marketing claim of safety?

Experts suggest it's because these supposed technical solutions are easier to implement than the effective social measures that schools often lack the resources to put in place. I spoke with Isabelle Barbour, a public health consultant who has experience working with schools to implement mental health supports. She pointed out that there are considerable barriers to families, kids, and youth accessing health care and mental health supports at a community level. There is also a lack of investment in supporting schools to effectively address student health and well-being. As a result, many students come to school with unmet needs, and those needs impair their ability to learn. Although there are clear and proven measures that work to address the burdens youth face, schools often need support (time, mental health expertise, community partners, and a budget) to implement them. Edtech companies market largely unproven plug-and-play products to educational professionals who are stretched thin and seeking a path forward to help kids. Is it any wonder that schools sign contracts which are easy to point to when questioned about what they are doing with regard to the youth mental health epidemic?

One example: in its marketing to school districts, Gaggle claims to have saved 5,790 student lives between 2018 and 2023, a figure based on shaky metrics the company itself designed. All the while, it keeps the inner workings of its AI monitoring secret, making it difficult for outsiders to scrutinize or measure its effectiveness.

We give Gaggle an “F”

Reports keep surfacing of errors and of the AI flagging’s inability to understand context. When the Lawrence, Kansas school district signed a $162,000 contract with Gaggle, no one batted an eye: it joined a growing number of school districts (currently around 1,500) nationwide using the software. Then school administrators called in nearly an entire class to explain photographs Gaggle’s AI had labeled as “nudity,” because the software would not show them the flagged images:

“Yet all students involved maintain that none of their photos had nudity in them. Some were even able to determine which images were deleted by comparing backup storage systems to what remained on their school accounts. Still, the photos were deleted from school accounts, so there is no way to verify what Gaggle detected. Even school administrators can’t see the images it flags.”

Young journalists within the school district raised concerns about how Gaggle’s surveillance of students impacted their privacy and free speech rights. As journalist Max McCoy points out in his article for the Kansas Reflector, “newsgathering is a constitutionally protected activity and those in authority shouldn’t have access to a journalist’s notes, photos and other unpublished work.” Despite having renewed Gaggle’s contract, the district removed the surveillance software from the devices of student journalists. Here, a successful awareness campaign resulted in a tangible win for some of the students affected. While ad-hoc protections for journalists are helpful, more is needed to honor all students' fundamental right to privacy against this new front of technological invasions.

Tips for Students to Reclaim their Privacy

Students struggling with the invasiveness of school surveillance AI may find some reprieve by taking measures and forming habits to avoid monitoring. Some considerations:

  • Consider any school-issued device a spying tool. 
  • Don’t try to hack or remove the monitoring software unless your school explicitly allows it: doing so may bring significant consequences from your school or from law enforcement. 
  • Instead, turn school-issued devices completely off when they aren’t being used, especially while at home. This will prevent the devices from activating the camera, microphone, and surveillance software.
  • If not needed, consider leaving school-issued devices in your school locker: this will avoid depending on these devices to log in to personal accounts, which will keep data from those accounts safe from prying eyes.
  • Don’t log in to personal accounts on a school-issued device (if you can avoid it - we understand sometimes a school-issued device is the only computer some students have access to). Rather, use a personal device for all personal communications and accounts (e.g., email, social media). Maybe your personal phone is the only device you have to log in to social media and chat with friends. That’s okay: keeping separate devices for separate purposes will reduce the risk that your data is leaked or surveilled. 
  • Don’t log in to school-controlled accounts or apps on your personal device: that can be monitored, too. 
  • Instead, create another email address on a service the school doesn’t control which is just for personal communications. Tell your friends to contact you on that email outside of school.

Finally, voice your concern and discomfort with such software being installed on devices you rely on. There are plenty of resources to point to, many linked to in this post, when raising concerns about these technologies. As the young journalists at Lawrence High School have shown, writing about it can be an effective avenue to bring up these issues with school administrators. At the very least, it will send a signal to those in charge that students are uncomfortable trading their right to privacy for an elusive promise of security.

Schools Can Do Better to Protect Students’ Safety and Privacy

It’s not only the students who are concerned about AI spying in the classroom and beyond. Parents are often unaware of the spyware deployed on the school-issued laptops their children bring home. And when a shared, privately owned computer is logged into a school-issued Google Workspace or Microsoft account, a parent’s web searches become visible to the monitoring AI as well.

New studies have uncovered some of the mental detriments that surveillance causes. Despite this and the array of First Amendment questions these student surveillance technologies raise, schools have rushed to adopt these unproven and invasive technologies. As Barbour put it: 

“While ballooning class sizes and the elimination of school positions are considerable challenges, we know that a positive school climate helps kids feel safe and supported. This allows kids to talk about what they need with caring adults. Adults can then work with others to identify supports. This type of environment helps not only kids who are suffering with mental health problems, it helps everyone.”

We urge schools to focus on creating that environment, rather than subjecting students to ever-increasing scrutiny through school surveillance AI.

Bill Budington

[B] "Western Sahara Refugee President Ghali Appeals Directly to the UN Secretary-General" [Latest News from Western Sahara], by Itsuko Hirata

3 months 2 weeks ago
In 1999, the people of East Timor chose independence in a referendum (UN popular consultation), and this year marks the 25th anniversary. In 1999, the people of Western Sahara were also scheduled to hold a referendum, but it remains stalled because of Morocco's opposition. For the 25th anniversary ceremony of its UN self-determination referendum, East Timor invited António Guterres, the former Portuguese prime minister and current UN Secretary-General, together with Brahim Ghali, the representative of Western Sahara, which is still waiting for its own referendum, and brought the two together.
Nikkan Berita

Frontex goes drone shopping as EU looks to keep migrants out

3 months 2 weeks ago

"In total, Frontex spent roughly €275 million in pilot projects from 2014 to 2022 to research new technologies, many of them related to drones, Yasha Maccanico, a researcher at non-profit Statewatch and the University of Bristol, told Euractiv. 

(...)

For Maccanico, the increased spending on drones for “situational awareness” in “pre-frontier areas” essentially means that Frontex will be able to identify vessels earlier and closer to the borders of third countries with patchy human rights records, such as Libya or Tunisia, and therefore push EU borders further back. 

Using more drones for surveillance means fewer European coastguards and Frontex vessels will be needed at sea. With fewer EU vessels, there is a greater chance that a non-EU country will respond to migrant boats, removing the obligation for EU responders to bring them ashore, said Maccanico."

Full story here.

Statewatch