Forensic Analysis and Privacy Risks of LLM Mobile Apps: ChatGPT, Copilot, and Gemini

Organized by: International Forensic Scientist Awards
Website: forensicscientist.org

13th Edition of the Forensic Scientist Awards | 28–29 August 2025 | Berlin, Germany

Introduction
Large Language Model (LLM) apps such as ChatGPT, Microsoft Copilot, and Google Gemini are becoming part of our daily lives. From answering questions to assisting with translation, research, and customer support, these AI-powered apps provide fast, human-like responses. But behind the convenience lies a critical question: what data are these apps collecting, and how does that affect your privacy?

Why This Matters
When installed on mobile devices, LLM apps gain access to sensitive personal data such as:

  • Account information

  • Location data

  • Device details

  • Browser cookies

While these permissions often enable smoother functionality, they can also create forensic artifacts—digital traces that investigators (and potentially malicious actors) can use. More importantly, such access can raise privacy concerns if the data is over-collected or misused.

The Case Study: ChatGPT, Copilot, and Gemini
Our analysis looked closely at how these three leading apps store and manage user data:

  • ChatGPT: Stores conversation history in plain text along with browser data on both Android and iOS.

  • Copilot: Similarly keeps conversation and browser data in plain text, leaving it open to potential forensic recovery.

  • Gemini: Takes a different approach by storing conversation data, browser activity, and even images entirely in the cloud. This data can be extracted using Google Takeout.
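To illustrate why plain-text, on-device storage matters forensically, the sketch below triages an extracted app sandbox for readable conversation artifacts. This is a minimal illustration, not the method used in the study: the directory layout and file names (e.g. `conversations.json`) are invented for the demo, and real artifact paths differ per app, platform, and version.

```python
import os
import tempfile

# Keywords an examiner might look for during a quick triage pass.
KEYWORDS = (b"conversation", b"message", b"prompt")

def find_plaintext_artifacts(root):
    """Walk an extracted app sandbox and flag files that both decode
    cleanly as UTF-8 text and mention conversation-related keywords."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    data = f.read(64 * 1024)  # triage: first 64 KiB only
            except OSError:
                continue
            try:
                data.decode("utf-8")  # treat as plain text if it decodes
            except UnicodeDecodeError:
                continue  # binary or encrypted content is skipped
            if any(k in data.lower() for k in KEYWORDS):
                hits.append(os.path.relpath(path, root))
    return sorted(hits)

# Demo on a synthetic extraction directory (a stand-in for app data
# pulled from a device; the contents are fabricated for illustration).
demo = tempfile.mkdtemp()
os.makedirs(os.path.join(demo, "files"))
with open(os.path.join(demo, "files", "conversations.json"), "w") as f:
    f.write('{"conversation": [{"role": "user", "content": "hello"}]}')
with open(os.path.join(demo, "files", "cache.bin"), "wb") as f:
    f.write(b"\x00\xff\xfe binary blob")

print(find_plaintext_artifacts(demo))
```

Because the ChatGPT and Copilot artifacts are unencrypted, even a scan this simple can surface them; Gemini's cloud-side storage would leave little for such a scan to find on the device itself.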

Forensic Value vs. Privacy Risk
From a forensic perspective, these stored artifacts can be valuable for investigations—helping to trace activities, communications, or even malicious usage of the apps. However, from a privacy standpoint, the situation is concerning:

  • Plain-text storage (ChatGPT and Copilot) means the data is easily accessible if the device is compromised.

  • Cloud storage (Gemini) centralizes sensitive information, increasing the impact if an account is compromised.

Key Takeaways

  1. Users should be aware that their conversations and activities may not be fully private.

  2. Developers must prioritize encryption and privacy-by-design principles.

  3. Investigators can leverage these artifacts for digital forensics, but this dual use also highlights the risk of misuse.

Conclusion
AI-powered mobile apps like ChatGPT, Copilot, and Gemini make our lives easier, but they also come with hidden forensic and privacy implications. As these tools become more integrated into daily life, balancing their usefulness with strong data protection is more important than ever.

🔗 Learn more and apply at:

https://forensicscientist.org/

Nominations are open now.
