AI, Internet Scams, and the Balance of Freedom | Chris Olson | EP 460

02 Jul 2024

Coming up (00:00:00)

  • Governments need to address digital crime as seriously as street crime.

Intro (00:00:18)

  • Chris Olson, CEO of the Media Trust Company, works to make the online world safer, primarily by protecting corporations' digital assets.
  • A substantial portion of online interaction is criminal, especially if pornography, which accounts for roughly 20-25% of internet traffic, is included.
  • Vulnerable groups targeted by online criminals include:
    • Elderly people: susceptible to romance scams and phishing scams.
    • Sick and infirm: targeted with false medical offers.
    • Teenage boys (around 17): targeted with offers to buy illicit drugs.
    • Young girls (13-14), especially those interested in modeling careers: targeted by human traffickers.
  • Elderly people are particularly well-identified demographically, making them easy targets for criminals.
  • The hyperconnected online world is increasingly pathological, and society needs to address it.

Domains of online harm, third-party code (00:03:04)

  • Chris Olson is the CEO and founder of the Media Trust Company, which helps big tech and digital media companies avoid causing harm when they monetize audiences and target digital content.
  • The company's work focuses on protecting consumers from digital crime, which targets people, in contrast to cybersecurity, which generally focuses on corporations, governments, and machines.
  • The internet is made up of roughly 80% third-party code, which means that when a consumer visits a website or uses a mobile app, the majority of the activity on their device comes from companies other than the owner of the website or app (a minimal sketch of enumerating a page's third-party scripts appears after this list).
  • Media Trust Company's job is to look at this third-party content, discern what is good and bad based on company policies and potential harm to the consumer, and then inform the companies of any violations and how to stop them.
  • Tech support scams and romance scams targeting seniors are rampant, with seniors being attacked online on a daily or weekly basis.
  • Teens are bombarded with information on how to buy opioids or other drugs and have them shipped to their homes.
  • 14-year-old girls interested in modeling are being approached by human traffickers.
  • The sick and infirm searching the internet for cures are having their life savings stolen.
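
As an illustration of the "80% third-party code" point above, here is a minimal Python sketch (not The Media Trust's actual tooling) that lists the external script domains a page's HTML references. It only sees statically declared scripts, so it understates the real footprint, and the example URL is a placeholder.

```python
# Minimal sketch: list the third-party script domains a page references,
# to get a feel for how much of what runs on a device comes from companies
# other than the site owner. Dynamically injected scripts are not counted.
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen


class ScriptSrcParser(HTMLParser):
    """Collect the src attribute of every <script> tag."""

    def __init__(self):
        super().__init__()
        self.srcs = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                self.srcs.append(src)


def third_party_script_domains(page_url: str) -> set[str]:
    """Return script-hosting domains that differ from the page's own domain."""
    first_party = urlparse(page_url).netloc
    html = urlopen(page_url).read().decode("utf-8", errors="replace")
    parser = ScriptSrcParser()
    parser.feed(html)
    domains = {urlparse(src).netloc for src in parser.srcs}
    return {d for d in domains if d and d != first_party}


if __name__ == "__main__":
    # Placeholder URL; substitute any page you want to audit.
    print(third_party_script_domains("https://example.com/"))
```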

Synthetic personas (00:07:57)

  • Chris Olson's company creates synthetic personas that act as honeypots to attract and detect problems in the digital supply chain (a hypothetical sketch of such a persona profile follows this list).
  • They have physical machines in over 120 countries and can create personas that look like senior citizens, teenagers, or people with illnesses.
  • Their job is to detect problems and help companies make them go away.
  • Seniors are often targeted by online scammers due to their high trust in corporate entities, lack of technological savviness, loneliness, and isolation.
  • Romance scams are particularly common, as scammers worm their way into seniors' confidence.
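
The episode does not describe how these personas are represented internally, so the following is only a hypothetical sketch of what a honeypot persona profile might look like; every field name and value is an assumption for illustration.

```python
# Hypothetical sketch of a synthetic "honeypot" persona profile.
# Field names and values are illustrative assumptions, not The Media Trust's schema.
from dataclasses import dataclass, field


@dataclass
class SyntheticPersona:
    persona_id: str
    country: str                       # one of the 120+ geographies with physical machines
    age_bracket: str                   # e.g. "78-85" to present as a senior citizen
    interests: list[str] = field(default_factory=list)  # browsing signals the persona emits
    device_profile: str = "desktop"    # browser/OS fingerprint the persona presents

    def describe(self) -> str:
        return f"{self.persona_id}: age {self.age_bracket}, {self.country}, interests={self.interests}"


# A persona shaped like the high-risk group described above: a lonely, trusting senior.
lonely_senior = SyntheticPersona(
    persona_id="persona-001",
    country="US",
    age_bracket="78-85",
    interests=["medicare plans", "online dating", "tech support"],
)
print(lonely_senior.describe())
```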

Dating scams, optimal criminals (00:10:19)

  • Romance scams target seniors on dating websites by building emotional connections to manipulate and steal from them. Criminals use algorithms to identify potential victims, including those showing signs of cognitive decline.
  • Dating websites can be hunting grounds for predators, including those involved in human trafficking.
  • Criminals may obtain stolen digital information from dating websites and use it to target victims months later through different channels.
  • AI has been used in digital media to target people since around 2010: initially to collect user data, then to build better audience segments for targeting, and now to write personalized content delivered to devices.
  • This has led to the emergence of sophisticated AI-powered criminals who can engage in highly personalized attacks.
  • AI systems can be highly persuasive and manipulative, making them effective tools for criminals.
  • The size of a person's digital footprint significantly affects how effective AI-powered attacks against them can be.

The footprint you leave online (00:19:16)

  • Criminal and corporate entities aggregate our online behavior, creating a detailed digital footprint of each individual.
  • This information is accessible to criminals, who use it to target individuals with scams and other malicious activities.
  • Big Tech and digital media companies struggle to detect these scams before they occur due to the open nature of the internet.
  • Artificial intelligence (AI) is being used by criminals to create increasingly sophisticated scams.
  • These scams are difficult to detect and can result in significant financial losses for victims.
  • AI-powered scams can also be used to manipulate public opinion and spread misinformation.

How do we conceptualize the status of law in a digital world? (00:21:58)

  • The status of law in the digital world is unclear, especially when dealing with international criminal gangs.
  • Cybersecurity is focused on protecting corporate assets and data, not individuals.
  • Governments are starting to implement data protection legislation, but it's mostly focused on giving consumers an opt-out mechanism, which isn't very effective.
  • The mindset of corporations and governments needs to change to prioritize protecting individuals from digital crime.
  • Governments need to take a more active role in policing digital crime: not just regulating big tech companies, but protecting people directly.
  • Digital crime is a serious problem, with grandmothers being mugged online at an alarming rate.
  • Beyond legislating, governments should actively engage with the digital ecosystem to reduce the number of attacks and protect victims.

Intermediary agencies need to enforce existing laws (00:27:15)

  • Criminals exploit the digital ecosystem by using intermediary processes to access local devices.
  • Intermediary agencies could be compelled to make it harder for criminals to use their services.
  • Cooperation between governments and private companies is necessary to reduce crime.
  • Governments should focus on protecting people rather than legislating on issues like raising the age of accessing social media.
  • A whole-of-society approach is needed to address the issue of internet scams.
  • AI can be used to create virtual victims to attract criminals and observe their activities.
  • Reporting criminal activity to companies can help reduce susceptibility to exploitation, but it doesn't identify or hold criminals accountable.
  • Criminals often operate from jurisdictions where they are unlikely to be held accountable.
  • Criminal activity can occur on the local machine, making it difficult to detect.

The reality of the problem for senior citizens, tech support scams (00:32:13)

  • Roughly 2.5% to 3% of the page impressions and app views served to senior citizens attempt to target them with some form of crime (a back-of-the-envelope calculation follows this list).
  • Senior citizens, especially female senior citizens aged 78 to 85, are highly targeted by criminals.
  • The legal system may not be able to find the criminals due to the vast amount of digital crime.
  • Working together and considering society as a whole can significantly reduce digital crime.
  • Tech support or upgrade scams are the most common form of criminal incursion targeting seniors.
  • Seniors are often tricked into believing there is something wrong with their computer and are prompted to call a phone number or click a button, leading them to more significant issues.
  • Educating seniors about these scams is challenging as they may not be tech-savvy enough to distinguish legitimate notifications from criminal ones.
  • The criminals are likely to outrun any educational efforts due to the sophistication of their tactics.
  • Cooperation between private companies and states is necessary to address digital crime effectively.
  • States need to consider establishing a police force dedicated to digital crimes.
  • Both private companies and states need to prioritize the well-being of people when addressing digital crime.
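
To make the 2.5-3% figure concrete, here is a back-of-the-envelope calculation; the number of daily impressions is an assumption for illustration, not a figure from the episode.

```python
# Back-of-the-envelope arithmetic for the 2.5-3% targeting rate quoted above.
# The impressions-per-day figure is an assumption for illustration only.
impressions_per_day = 100                        # assumed daily page/app views for one senior
attack_rate_low, attack_rate_high = 0.025, 0.03  # share of impressions carrying an attack

low = impressions_per_day * attack_rate_low
high = impressions_per_day * attack_rate_high
print(f"Expected targeting attempts per day:  {low:.1f} to {high:.1f}")
print(f"Expected targeting attempts per year: {low * 365:.0f} to {high * 365:.0f}")
```

Under that assumption, a moderately active senior would face roughly two to three criminal targeting attempts every day, which is part of why education alone struggles to keep up.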

Deep fakes, digital kidnapping (00:37:01)

  • AI-powered scams use voice copies and deepfake technology to trick victims into giving out personal information or money.
  • Stealing someone's digital identity to impersonate them should be considered a crime equivalent to kidnapping.
  • Invoking Section 230 or the First Amendment to argue that using someone's personal identity online to commit a crime is not itself a crime is illogical.
  • The pathway for AI-generated content should be monitored to catch criminals and entities perpetrating scams.
  • Deepfake technology is being used to create fear and manipulate public opinion, especially in political advertising.

The utility of doubt, our relationship to the digital ecosystem (00:43:48)

  • Deepfake audio is likely being used by political candidates to sow confusion among the public.
  • Weaponization of doubt is more valuable than the actual use of deepfakes to manipulate society.
  • People may become less trusting of digital content, leading to a preference for live events that can be trusted.
  • Promoting trust in journalism is important to combat the spread of misinformation and disinformation.
  • Seniors may find their relationship with the internet to be net negative due to the prevalence of digital crime and scams.
  • Focus should be on digital safety and consumer protection as technology advances.

Balancing freedom and safety online (00:47:34)

  • The proliferation of online pathology is due to free access to services, which also grants criminals free access to users.
  • Increasing the cost of reaching people reduces criminals' ability to target many people cheaply at once (a toy model of this economics appears after this list).
  • The internet may become more siloed and private as a result of this.
  • Paywalls can provide safety as criminal activity decreases when it becomes more expensive for them to do business.
  • The entire supply chain, including telecommunications companies and Wi-Fi providers, needs to engage in protecting users.
  • Privacy is important, but there also needs to be control over what content is delivered to users on the basis of the data collected about them.
  • The sheer volume of events per day makes it difficult to prevent all criminal access, even with paywalls.
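
The argument that raising the cost of access squeezes out criminals can be captured with a toy expected-profit model; all numbers below are illustrative assumptions, not figures from the episode.

```python
# Toy model of the economics behind "raise the cost of reaching people".
# All rates and dollar amounts are illustrative assumptions.
def expected_profit(contacts: int, cost_per_contact: float,
                    hit_rate: float = 0.0005, avg_take: float = 2_000.0) -> float:
    """Criminal revenue (victims x average theft) minus the cost of reaching targets."""
    return contacts * hit_rate * avg_take - contacts * cost_per_contact


for cost in (0.001, 0.10, 1.00, 2.00):  # effectively free access vs. priced/paywalled access
    profit = expected_profit(1_000_000, cost)
    print(f"cost per contact ${cost:>5.2f}: profit on 1M contacts = ${profit:,.0f}")
```

Under these assumptions the expected take is about $1 per contact, so mass targeting stays highly profitable while access is nearly free and collapses once each contact costs a dollar or more, which is the mechanism behind the paywall argument above.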

Legislative solutions (00:52:08)

  • Chris Olson believes state governments are more effective in taking action against digital crimes compared to the federal government.
  • Proposes allocating funds to police digital crimes in a similar manner to policing physical crimes.
  • Highlights the UK's approach of having agencies actively engage with the digital ecosystem to identify and mitigate attacks.
  • Suggests legislation that recognizes synthetic personas in a specific local geography as a crime, but acknowledges the difficulty in implementing such a solution.
  • Emphasizes the need for cooperation between law enforcement, big tech, and digital media companies to effectively address the issue.
  • Proposes sharing information gathered from targeted individuals to quickly turn off attacks and ultimately trace the source of the criminal content.
  • Stresses the importance of protecting individuals rather than just machines and maintaining this mindset throughout the process.
  • Chris Olson discusses the balance between freedom and security in the digital age, particularly in the context of AI and internet scams.
  • Highlights the increasing sophistication of AI-powered scams and the challenges they pose for individuals and society.
  • Emphasizes the need for education and awareness about these scams to empower individuals to protect themselves.
  • Chris Olson identifies specific groups of people who are commonly targeted by criminals:
    • The elderly
    • Immigrants
    • People with cognitive impairments
    • People who are lonely or isolated
  • Explains that these groups are often more vulnerable to scams due to factors such as lack of digital literacy, language barriers, or diminished cognitive abilities.

Drug sales and human trafficking (00:56:52)

  • Drug marketing through social media is still prevalent despite efforts by tech companies to remove it from digital advertising.
  • In Virginia, approximately 2,000 deaths from fentanyl or similar drugs occurred in 2023, with over 50% of those drug transactions originating online.
  • Young females interested in modeling, fashion, and presenting themselves on social media are often targeted by human traffickers.
  • Algorithms allow entities to track and find individuals off-platform and off social media.
  • The scope of the problem is significant: roughly 3% of elderly women's online interactions are facilitated by criminals.
  • Data on the specific situation for 14-year-old girls who have been shopping for fashion online is difficult to find.
  • AI enables highly personalized, targeted scams that are harder to detect: fake profiles, personalized messages, legitimate-looking phishing emails, and convincing imitations of human behavior.
  • These scams can bypass traditional security measures by targeting individuals based on their online behavior and personal information.
  • Deepfake technology can produce realistic videos and images of people saying or doing things they never did, making it harder to distinguish real from fake content.
  • AI-generated content can be used to fabricate news articles, reviews, and social media posts, spreading misinformation and manipulating public opinion.
  • AI lets scammers automate outreach (spam email, phone calls, online surveys) and analyze large datasets for patterns, so they can reach and target large numbers of people quickly and effectively.

Medical scams (01:00:39)

  • Senior citizens and individuals seeking medical solutions are the primary targets of medical scams.
  • Scammers exploit the desperation and vulnerability of these individuals.
  • Scams range from stealing money and personal information to selling fraudulent medical products.
  • The impact of these scams is significant, causing financial losses and creating a sense of fear and anxiety among victims.
  • The ease of accessing information about individuals who frequently visit healthcare facilities contributes to their vulnerability to these scams.

Parasites on and offline: what we can learn from evolution (01:03:02)

  • Parasites are simpler than their hosts and can breed faster, giving them an advantage in an evolutionary arms race.
  • Sex evolved as a way to confuse parasites and prevent them from being transmitted perfectly from generation to generation.
  • The internet is a new digital ecosystem that is vulnerable to invasion by parasites in various forms, including criminals, online trolls, and quasi-criminal activities like pornography.
  • If the internet becomes swamped by parasites, it could lead to societal collapse, as has happened in the past when parasites gained the upper hand.
  • Allowing the unregulated flourishing of parasitical criminals online risks destabilizing society because even a small number of them can cause significant trouble.

The “bigger” concern and the necessary brand sacrifice (01:07:04)

  • The primary concern for digital entities is their content, not the consumer.
  • Third-party content about controversial topics, such as the Israel-Hamas conflict, poses a reputational risk to big brands like Coca-Cola and Procter & Gamble.
  • Tech companies prioritize monetizing each pixel delivered to an end device rather than consumer safety.
  • Digital safety for brands focuses on ensuring their product's image is positive, while consumers face real financial losses.
  • A shift towards prioritizing consumer safety may require a sacrifice from big tech and digital media companies.
  • Companies that prioritize digital safety and build trust with users will ultimately win in the long run.

Trust is the basis of wealth (01:09:33)

  • Trust is the foundation of long-term wealth for companies.
  • Disney's brand value is an example of the importance of trust in building wealth.
  • It may be in the best interest of large online companies to ensure the safety of their users rather than solely focusing on product safety.

Using AI to fight cybercrime (01:10:24)

  • AI-generated virtual personas are used to attract and track online criminals.
  • Criminals use various tactics to evade detection and takedown, such as creating multiple copies of malicious content (a minimal duplicate-detection sketch follows this list).
  • The focus is on protecting people from financial loss and scams rather than on AI defeating AI.
  • Companies can prevent millions of dollars in potential losses by proactively detecting and removing malicious content.
  • Companies that prioritize user trust and work with governments to combat online crime will likely be more successful in the long run.
  • The speaker plans to continue the discussion with Chris Olson on The Daily Wire to learn more about his company, his interest in preventing online crime, and his future plans.
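
Since the episode notes that criminals re-upload many copies of the same malicious content, here is a minimal sketch of flagging exact re-uploads by content hash. This is an illustration, not the company's method; real systems need fuzzier fingerprints (perceptual hashes, ML similarity) because each copy is usually tweaked slightly.

```python
# Minimal sketch: flag re-uploaded copies of known-bad content by hash.
# Only catches byte-for-byte duplicates; real systems use fuzzier fingerprints.
import hashlib

known_bad_hashes: set[str] = set()


def fingerprint(content: bytes) -> str:
    """Stable fingerprint of a creative's raw bytes."""
    return hashlib.sha256(content).hexdigest()


def flag_if_known_bad(content: bytes) -> bool:
    """Return True if this creative is an exact copy of known malicious content."""
    return fingerprint(content) in known_bad_hashes


# Once a malicious ad creative is confirmed, remember it...
known_bad_hashes.add(fingerprint(b"<script>fake tech-support popup</script>"))
# ...and block the identical copy the next time it appears in the supply chain.
print(flag_if_known_bad(b"<script>fake tech-support popup</script>"))   # True
print(flag_if_known_bad(b"<script>slightly tweaked popup</script>"))    # False
```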
