
Adolescents are consistently bombarded with promotional material for e-cigarettes and vaping products on Instagram, contravening regulatory policies and potentially fueling youth usage, a recent study reveals.

Researchers investigating the prevalence of e-cigarette-related content on Instagram found that the majority of posts pertaining to these products are directed towards young audiences, violating both federal regulations and Instagram’s own policies on branded content.

Presenting their findings at the American Psychiatric Association (APA) 2024 Annual Meeting on May 4, Jessica Tran, a fourth-year medical student at the University of Texas Medical Branch, emphasized the urgent need for stricter enforcement of FDA regulations and Instagram’s content policies to curb adolescent exposure to promotional content.

Tran’s study employed a simulated Instagram profile of a 14-year-old girl to analyze posts related to e-cigarettes and vaping products. Despite regulations mandating warning labels on tobacco products, including e-cigarettes, and Instagram’s ban on branded content promoting such items since 2019, the researchers uncovered significant gaps in compliance.

Of the 51 e-cigarette-related posts analyzed, approximately two-thirds depicted e-cigarette use positively, with only a quarter offering a negative portrayal. Furthermore, over half of the posts were promotional in nature, violating Instagram’s branded content policy, while nearly half were shared by influencers or vape shops, exacerbating the issue.

Moreover, the study found that a significant portion of posts did not include warnings about age restrictions or the addictive nature of nicotine, further flouting FDA regulations aimed at protecting youth.

Commenting on the study’s implications, Dr. Howard Liu, Chair of the Department of Psychiatry at the University of Nebraska Medical Center, underscored the importance of disseminating accurate health information, particularly on social media platforms where youth are highly active. He emphasized the role of healthcare organizations in countering misinformation and providing reliable resources.

Rob Morris, CEO of Koko, a nonprofit utilizing artificial intelligence to detect harmful mental health content, highlighted the challenges of content moderation on social media platforms. He stressed the necessity for collaboration between companies and organizations to address gaps in existing systems.

The study, which received no specific funding, serves as a stark reminder of the persistent challenges posed by the promotion of e-cigarettes to adolescents on social media platforms like Instagram, and underscores the need for concerted efforts to safeguard youth from potential harm.

In the face of evolving slang and hashtags, ensuring compliance with regulations and content policies remains an ongoing challenge, necessitating collaborative approaches between regulatory bodies, social media platforms, and public health advocates.
