
    Cybertraps Podcast

    We explore the risks arising from the use and misuse of digital devices and electronic communication tools. We interview experts in the fields of cybersafety, cybersecurity, privacy, parenting, and technology and share the wisdom of these experts with you!
    Jethro Jones & Frederick Lane · 173 Episodes

    Episodes (173)

    Digital Safety for Kids in the Connected World with “Officer E”

    Graig Erenstoft, known as Officer E, is the digital safety expert for Connected Class. He has been presenting to parents and students for over 24 years and is currently a police lieutenant in Florida. 


    In this episode, Officer E and host Ross Romano talk about:


    • Defining digital safety for kids and its critical importance in the connected world
    • The “Take Three for Digital Safety” video series for parents
    • Common questions parents have about digital safety
    • Facilitating parent-child conversations
    • The fast-shifting digital landscape
    • Mental health challenges from social media
    • Tips to foster good, safe habits


    On Thursday, November 9 from 7-8 p.m. Eastern, Officer E will present a free webinar for parents and educators. If you’d like to attend, RSVP below:


    Take Three for Digital Safety: Tips for Keeping Your Child Safe

    RSVP Here or at https://connectedclass.com 

    About our guest

    Officer E serves as the digital safety expert for Connected Class. He hosts the series “Take Three for Digital Safety,” which provides valuable resources and tips to help parents navigate the challenges presented by the latest technology and social media trends. He has been presenting to parents and students for over 24 years and is currently a police lieutenant in Florida. Before making the leap into law enforcement, Officer E worked as a youth director for 12 years. As a law enforcement officer, he has served as a School Resource Officer, a Field Training Officer, and, among other duties, a leader in the youth Police Athletic League. He enjoys engaging with students and parents to provide them with tools and resources to make good choices and stay safe. Officer E draws on his training and experience as a police officer, husband, and father of two boys to relate to the challenges parents face each day in an ever-changing world on topics such as bullying, drugs and alcohol, and technology and social media.


    About today’s host

    Ross Romano is a co-founder of the Be Podcast Network and CEO of September Strategies LLC. He hosts The Authority Podcast, on which he interviews leading authors from the education world and beyond to draw out their invaluable insights on leadership, culture-building, transformation, and student & educator success. Listen here: https://authoritypodcast.net 


    Ross is a leadership development and performance coach for professionals in a range of industries. He consults with organizations and high-performing leaders in the K-12 education industry, helping them communicate their vision and make strategic decisions that lead to long-term success. Connect on Twitter @RossBRomano and LinkedIn.

    Teaching Cybersecurity using Sphero with Tod Johnston Cybertraps 158

    Tod Johnston, a Senior Education Content Manager at Sphero, discusses how his company uses robots to teach cybersecurity concepts to middle school students. Their robotic balls help students visualize abstract cybersecurity topics like man-in-the-middle attacks. Tod explains how they developed lessons in collaboration with cybersecurity experts to give students an initial understanding of cyber threats and how to act responsibly online. Tod hopes to expand these lessons to younger students in the future. The discussion also touches on the challenges of educating both students and adults about cybersecurity given that technology is evolving rapidly and privacy policies are often difficult to understand.

    • Sphero Blueprint – basics of engineering
    • Educators need to think about cybersecurity from a student’s perspective, rather than a technology perspective. 
    • We should be inviting students to learn about their privacy policies to help them make better choices. 
    • Sphero programmable balls are good for teaching programming and algorithmic skills, but it’s always difficult to teach cybersecurity. 
    • An example of a man-in-the-middle attack (see the sketch after this list) 
    • Can’t damage other people’s property 
    • Student in Miami-Dade who hacked the school district. 
    • Dr. Pauline Mosley collaborated on Sphero’s curriculum 
    • Hopes for how software and hardware should be designed in the future. 

    • How GDPR has ruined the web
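
    For readers who want to see the idea behind the attack Tod’s lessons visualize, here is a minimal, self-contained Python sketch of a man-in-the-middle attack. It is illustrative only and not Sphero’s lesson code; the Sender, Interceptor, and Receiver classes and the sample “robot” command are invented for this example.

```python
# Minimal illustration of a man-in-the-middle (MITM) attack:
# traffic passes through an untrusted relay that can read and alter it.

class Receiver:
    """The intended destination of the message."""
    def deliver(self, message: str) -> None:
        print(f"Receiver got: {message!r}")


class Interceptor:
    """Sits between sender and receiver, eavesdropping and tampering."""
    def __init__(self, real_receiver: Receiver) -> None:
        self.real_receiver = real_receiver

    def deliver(self, message: str) -> None:
        print(f"Interceptor saw: {message!r}")                  # eavesdropping
        tampered = message.replace("turn left", "turn right")   # tampering
        self.real_receiver.deliver(tampered)


class Sender:
    """Sends messages over whatever channel it was handed."""
    def __init__(self, channel) -> None:
        self.channel = channel  # the sender cannot tell a relay from the real receiver

    def send(self, message: str) -> None:
        self.channel.deliver(message)


if __name__ == "__main__":
    honest = Receiver()
    # The sender believes it is talking directly to the receiver,
    # but the "channel" it was given is actually the interceptor.
    Sender(Interceptor(honest)).send("robot, turn left at the wall")
```

    The standard defense is to authenticate and encrypt the channel (for example with TLS) so a relay can neither read messages nor alter them without detection.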

    About Tod Johnston

    Tod Johnston is a Senior Education Content Manager at Sphero, leveraging over 10 years of experience in classroom settings. With a focus on classroom technology, math education, STEM, and the environment, Tod applies practical teaching expertise to positively impact technology integration in schools. He also has experience as a Learning Experience Designer – designing curriculum, presenting at conferences, and researching educational technology and math education trends. He is dedicated to transforming education through innovative approaches.

    AI Policies in Schools Cybertraps 157

    Cybertraps 157 discusses the APLUS Framework for adopting AI in schools, which emphasizes Accessibility, Privacy and Ethics, a Learner-centered approach, Usability, and Sustainability. The episode highlights the irony of principals wanting AI to assist them while trying to prevent students from doing the same. Examples of AI policies, including a plagiarism policy, are mentioned, and the hosts emphasize viewing AI as an ecosystem rather than just a tool. A blog post citing a 73% statistic is also referenced.

    Those Who Don't Read Science Fiction Are Doomed to Repeat It Cybertraps 156

    In Cybertraps 156, the podcast discusses the potential dangers of AI-equipped teddy bears: toys that can read children personalized bedtime stories built from private information the toys have overheard. The episode highlights a news item warning about the privacy concerns associated with these “scary” gadgets.

    Bot's Up, Doc? with Jethro Jones Cybertraps 155

    In this episode, Jethro and Fred discuss chatbots and artificial intelligence. The episode covers the history of chatbots, including the Turing Test and the development of Large Language Models (LLMs) such as ChatGPT, Bing, and Jasper. The potential uses of and issues with chatbots are explored, including incomplete information or misinformation, theft of intellectual property, inappropriate uses, and threats to various types of jobs. The episode also touches on the impact of chatbots on education and the potential for weaponized disinformation, cybersecurity threats, and more emotion-targeted advertising.

    • Beginning of Cybertraps Podcast Episode Index
    • Writebettr.com - test out AI with your poorly written emails
    • AILeader.info - learn about AI and how to use it to save time with 3-minute masterclasses.
    • Today’s Topic: Bot’s Up, Doc?
    • Keynote delivered at last minute for Alaska Society for Technology in Education
    • Artificial Life and Artificial Intelligence
      • Why chatbots are NOT “artificial intelligence” – yet
    • “The Father of Chat”
      • The Turing Test
      • Alan Turing OBE FRS [1912–1954] – British mathematician and computer scientist
      • Leader in development of computer and algorithmic theory
      • At Bletchley Park, helped design a machine to crack the Enigma code
      • 1950 – Turing devises The Turing Test:
        • Can a computer produce answers indistinguishable from a human?
        • The Imitation Game
      • 1954 – Turing commits suicide
    • Large Language Models (LLMs)
      • ChatGPT (esp. 4)
      • Bing
      • Jasper
      • embedded AI
      • Photoshop
      • Google Workspace
      • incredibly rapid change
    • Current ChatGPT Issues
      • Incomplete Data or Misinformation
      • Theft of Intellectual Property
      • Inappropriate Uses
      • Response to MSU Shooting
      • Threat to Various Types of Jobs
      • Mid- to Lower-Level Tech
      • Media / PR Professionals
      • Customer Service
      • Paralegals / Attorneys?
      • Religious Leaders?
      • Monetization
    • A Quick Object Lesson
      • Censorship Is a Biz-Kill
      • China Was a Tech Leader in 2010s
      • WeChat
      • AliPay
      • Beijing (CCP) Got Nervous
      • Party Officials Took Corporate Seats
      • Goal Was to Limit Social Influence
      • Chinese Tech Companies Slashed Investment in Pure Research
    • ChatGPT and Education
      • A Flawed Resource for Students
      • Incomplete Information
      • Misinformation
      • Kids Will Use Technology to Cheat
        • Not the First Time …
        • Several Schools Have Had Cheating Scandals
        • NYC Blocked, then Unblocked, Access to ChatGPT
      • Responses and Solutions
      • Tools for Identifying Chat-Generated Content
      • Incorporate Chat Critiques into Curricula
      • The Revenge of the Palmer Method?
      • Create Assessments that ChatGPT Can’t Answer
    • The Parade of Horribles
      • Weaponization of Disinformation
      • Cybersecurity
      • Social Engineering
      • Scams and Spams
      • Manipulative Suggestions
      • Integration with Other Technologies
      • More Emotion-Targeted Advertising
      • Displaced Emotional Relationships
      • Personalized Chatbot (“Amanuensis”)
      • Fasten Your Seat Belt. It’s Going to Be a Bumpy Night.

    Armies of Enablers with Amos Guiora Cybertraps 154

    In this episode, Fred Lane interviews Amos Guiora, a law professor at the University of Utah. The bulk of the interview centers on Professor Guiora’s recently published book, "Armies of Enablers: Survivor Stories of Complicity and Betrayal in Sexual Assaults". In his book, Professor Guiora attempts to answer a difficult question:


    “What do sexual assault survivors expect of the enabler-bystander? In this powerful book, Amos N. Guiora shares the stories of survivors to expose how individual and institutional enablers allow predators to perpetrate their crimes through silence and other failures to act. He then proposes legal, cultural, and social measures aimed at the enabler from the survivor’s perspective.” 


    In addition to his work at the University of Utah law school, Professor Guiora has been active in S.E.S.A.M.E., the organization led by Terri Miller that is working to end the so-called “passing of the trash.”


    Frederick Lane is an author, attorney, educational consultant, and lecturer based in Brooklyn, NY. He is the co-founder of The Center for Cyberethics and a nationally recognized expert in the areas of cybersafety, digital misconduct, personal privacy, and other topics at the intersection of law, technology, and society. Lane has appeared on “The Daily Show with Jon Stewart,” CNN, NBC, ABC, CBS, the BBC, and MSNBC.


    He has written ten books, including most recently Cybertraps for Educators 2.0 (2020), Raising Cyberethical Kids (2020), and Cybertraps for Expecting Moms & Dads (2017). He is currently working on his newest book, The Rise of the Digital Mob (Beacon Press 2022). All of his books are available on Amazon.com or through his Web sites, FrederickLane.com and Cybertraps.com.


    With Jethro Jones (The Transformative Principal), Lane co-hosts “The Cybertraps Podcast.” He is also the publisher of “The Cybertraps Newsletter” (newsletter.cybertraps.com).

    Ghana Update and the Growing Problem of Deepfakes Cybertraps 153
    • Update from Ghana
      • #2023–03–13_1100 Meeting with the Cybercrime Unit of the Ghana Police Service
      • #2023–03–14_1200 Visit to 5/6 classroom at Primus Hybrid School
      • #2023–03–19_1400 Pan-Africa webinar for parents
        • How can parents and carers monitor their children’s online activity without infringing on their privacy?
        • What are the long-term effects of excessive technology use on children’s mental and physical health?
        • How can parents and carers stay informed about new technology trends and potential risks?
        • What should parents and carers do if they suspect their child is being cyberbullied or harassed online?
        • How can parents and carers effectively communicate with their children about technology use without creating conflict or tension?
        • How can parents and carers address their own technology use and set a good example for their children?
        • What is the role of peer pressure and social media in shaping children’s online behavior, and how can parents and carers help children navigate these pressures?
        • How can technology be used to enhance learning and development for children, and what are some best practices for incorporating technology into education?
        • How can parents and carers help children build healthy relationships with their devices and encourage offline activities and hobbies?
        • What is the role of technology companies and platforms in promoting safe and responsible technology use, and how can parents and carers hold them accountable?
    • The Growing Problem of Deepfakes
      • News Item: New York students create a deepfake video of middle school principal saying racist things
        https://www.washingtonpost.com/nation/2023/03/14/racist-deepfakes-carmel-tiktok/
      • Details
        • The target of the malicious attack was George Fischer Middle School
        • In late January or early February, multiple videos were released on TikTok, with a male voice laid over videos of Principal John Piscitella
        • The voiceovers contained racist statements and threats of violence
        • TikTok quickly took the videos down, but not before they were seen by multiple students
        • Carmel Central School District sent out a letter on February 13, 2023, alerting parents to the videos and saying that three high-schoolers had “used artificial intelligence to impersonate the staff” and made them appear to make “inappropriate comments” in videos
        • The school did not describe the videos, nor did it specifically mention the racist comments or threats of violence
        • Simultaneously, local police closed their investigation after determining that no crime had been committed
        • The District defended its response to angry parents, saying that “they were trying to balance disclosing sensitive information without generating panic”
        • But parents accused the District of minimizing the videos
        • The videos raise many issues, the most controversial being:
          • Racism
          • Student Privacy
          • The Use and Abuse of Technology (particularly AI)
          • Threats of Gun Violence
        • Disciplinary action was taken against three students, but the District refused to say what action was taken
      • Analysis
        • Schools need to be more transparent about the nature of incidents like these
        • We may need to consider the cost of student privacy
        • These were relatively crude deepfake videos; the technology exists now to make much more convincing videos
      • Additional Resources
        • #2023–03–09 Principal appears to spew racist threats in disturbing video — but it never actually happened
          https://www.msn.com/en-us/news/us/principal-appears-to-spew-racist-threats-in-disturbing-video-%E2%80%94-but-it-never-actually-happened/ar-AA18qImu
        • #2023–03–08 High Schoolers Made a Racist Deepfake of a Principal Threatening Black Students
          https://www.vice.com/en/article/7kxzk9/school-principal-deepfake-racist-video
        • #2023–03–02 TikTok videos threatening Black students have Carmel parents on edge, district promising change
          https://www.lohud.com/story/news/education/2023/03/02/racist-tiktok-videos-threaten-black-kids-in-carmel-ny-worry-parents/69941181007/
    • Fred’s presentation at ASTE about Chatbots