Artificial intelligence has entered offices, hospitals, and factories. Increasingly, it is entering homes. In Bengaluru, a software engineer named Pankaj Tanwar discovered just how powerful—and controversial—that shift can be after building a homemade AI surveillance system to monitor his kitchen. Suspicious that fruit was disappearing from his refrigerator, Tanwar installed a camera in a cupboard and created what he called an “AI roommate,” a system designed to track activity in real time and notify him of unusual behavior.
Within a week, the system flagged two instances where fruit had been removed from the refrigerator without explanation. According to Tanwar’s public posts on X, formerly Twitter, the alerts helped confirm his suspicion that his cook was taking apples, bananas, and blueberries from the fridge. He subsequently dismissed the worker, who reportedly earned around ₹4,800 per month.
The episode quickly went viral across Indian social media and tech forums. Some users applauded the ingenuity of a do-it-yourself AI monitoring system built using affordable hardware and open-source tools. Others were disturbed by the implications: a domestic worker being secretly monitored by artificial intelligence inside a private home.
The story touches a nerve in a rapidly digitizing society. Surveillance technology that once required corporate infrastructure can now be assembled by hobbyists with a Raspberry Pi, a camera module, and open-source machine learning software. As the cost of AI falls, the boundary between smart home convenience and intrusive monitoring grows increasingly blurred.
In Tanwar’s case, a missing banana triggered a global conversation about the ethics of watching people with machines.
A Suspicion That Sparked a System
Stories of missing food from office refrigerators are common enough to be a running joke in workplaces. In Tanwar’s kitchen, however, the problem became a technical challenge.
According to posts shared online, Tanwar suspected that fruits stored in his refrigerator were disappearing more quickly than expected. Instead of confronting the cook directly, he turned to the tools he knew best: artificial intelligence and computer vision.
It is a mindset common among engineers and developers: when confronted with uncertainty, they build systems to observe the world more precisely.
Tanwar mounted a small camera inside a cupboard overlooking the kitchen workspace and refrigerator. The device streamed video to a local computing unit where AI models analyzed activity in real time.
The result was an automated monitoring system capable of detecting actions such as:
- Opening the refrigerator
- Removing objects from shelves
- Cleaning or cooking activities
- Handwashing behavior
Within days, the system had generated a report identifying two occasions where fruit had been removed without explicit permission.
For Tanwar, the evidence confirmed what he had suspected. For the internet, it ignited a debate.
How the “AI Roommate” Worked
Unlike expensive commercial surveillance systems, Tanwar’s setup relied on accessible hardware and open-source software commonly used in hobbyist AI projects.
The core components included a small computer, a camera, and machine learning models trained to recognize objects and human actions.
Core Hardware Setup
| Component | Purpose | Approximate Cost |
|---|---|---|
| Raspberry Pi 4 or 5 | Local computing for AI inference | $70–$120 |
| USB or Pi Camera Module | Video capture in kitchen | $30–$50 |
| 64GB MicroSD Card | Local storage for event clips | $10–$20 |
| Power Adapter and Mount | Camera placement and power | $10–$20 |
The entire system could be assembled for under $150, making it far cheaper than professional security infrastructure.
Video feeds were processed locally on the Raspberry Pi, where lightweight computer vision models analyzed each frame. The system detected specific objects such as apples, bananas, or other items while also identifying human presence.
When an event occurred—such as the refrigerator door opening—the system logged the action and generated a notification.
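That detect-log-notify loop can be sketched in a few lines of Python. Everything here—the event names, the `notify` stub—is illustrative; Tanwar's actual code has not been published.

```python
import json
import time

def log_event(log, label, action):
    """Append a timestamped detection event to an in-memory log."""
    event = {"time": time.time(), "label": label, "action": action}
    log.append(event)
    return event

def notify(event):
    """Stand-in for a real alert channel (push notification, messaging bot, etc.)."""
    print(f"ALERT: {event['label']} -> {event['action']}")

# Example: the system sees the fridge door open and an apple removed.
log = []
log_event(log, "refrigerator", "door_opened")
alert = log_event(log, "apple", "removed")
notify(alert)

print(json.dumps(log[-1]))
```

In a real deployment the log would be written to the MicroSD card alongside the corresponding video clip, so each alert can be checked against footage.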
The simplicity of the design reflects a broader trend: AI capabilities that once required cloud computing are now possible on inexpensive hardware.
Software Behind the Surveillance
The software layer of the system combined several open-source technologies commonly used in the maker and home automation communities.
Frigate NVR, a popular AI-powered video surveillance platform, handled object detection and motion tracking. It uses machine learning models such as YOLO to identify objects in camera feeds.
Additional scripts written in Python processed events and generated alerts.
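Frigate publishes detection events as JSON over MQTT (on the `frigate/events` topic), so a typical glue script simply parses those payloads and decides whether to raise an alert. The sketch below follows the general shape of Frigate's event payload, but the watched-label set and alert rule are illustrative assumptions, not Tanwar's actual logic.

```python
import json

# Illustrative set of object labels worth alerting on.
WATCHED_LABELS = {"apple", "banana", "orange"}

def should_alert(payload: str) -> bool:
    """Return True if a Frigate-style event JSON payload mentions a watched object.

    Frigate events carry a `type` ("new", "update", or "end") and an
    `after` object whose `label` names the detected class.
    """
    event = json.loads(payload)
    if event.get("type") != "new":  # only alert once, when the event starts
        return False
    label = event.get("after", {}).get("label", "")
    return label in WATCHED_LABELS

# A simplified example payload:
sample = json.dumps({"type": "new", "after": {"label": "banana", "camera": "kitchen"}})
print(should_alert(sample))  # → True
```

Keeping the decision logic in a pure function like this makes it easy to test without a camera or an MQTT broker attached.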
Example Software Stack
| Software Tool | Function |
|---|---|
| Raspberry Pi OS | Operating system for device |
| Frigate NVR | AI-based object detection and surveillance |
| YOLO Models | Real-time object recognition |
| OpenCV | Image processing and analysis |
| Local LLM | Behavioral summaries and reports |
The system also integrated a language model capable of summarizing activity logs.
For example, a weekly report might read:
“Person accessed refrigerator twice. Apple removed once. Handwashing detected once.”
This feature transformed raw video into structured behavioral summaries.
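How the language model was prompted is not documented, but the underlying step—collapsing an event log into a readable summary—can be done deterministically too. The sketch below counts events and renders a template; a local LLM could consume the same structured counts as a prompt. All names here are hypothetical.

```python
from collections import Counter

def summarize(events):
    """Turn a list of (label, action) events into a one-line weekly report."""
    counts = Counter(f"{label} {action}" for label, action in events)
    parts = [f"{desc} x{n}" for desc, n in sorted(counts.items())]
    return "; ".join(parts) + "."

events = [
    ("refrigerator", "accessed"),
    ("refrigerator", "accessed"),
    ("apple", "removed"),
    ("handwashing", "detected"),
]
print(summarize(events))
# → "apple removed x1; handwashing detected x1; refrigerator accessed x2."
```

The LLM's role, in other words, is mostly phrasing: the structure of the report comes from the event log itself.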
The combination of computer vision and language models is increasingly common in modern AI systems.
Face Blurring and Local Processing
One of Tanwar’s stated goals was maintaining privacy by processing all video locally and blurring faces automatically.
Face-detection models scanned each frame of the video feed for facial landmarks. Once a face was found, OpenCV applied a Gaussian blur to obscure it before the footage was stored or analyzed.
This process occurred in real time.
By blurring faces, the system attempted to reduce the identifiability of individuals in recorded clips.
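In OpenCV this pipeline is typically a face detector plus `cv2.GaussianBlur` applied to the detected bounding box. The dependency-free sketch below substitutes a simple 3×3 box blur on a grayscale frame (a list of row lists) to show the idea; the detection step is assumed to have already produced the box coordinates.

```python
def box_blur_region(frame, x, y, w, h):
    """Blur a rectangular region of a grayscale frame by replacing each
    pixel with the mean of its 3x3 neighbourhood. A simplified stand-in
    for cv2.GaussianBlur applied to a detected face box."""
    out = [row[:] for row in frame]
    for r in range(y, y + h):
        for c in range(x, x + w):
            neigh = [
                frame[rr][cc]
                for rr in range(max(0, r - 1), min(len(frame), r + 2))
                for cc in range(max(0, c - 1), min(len(frame[0]), c + 2))
            ]
            out[r][c] = sum(neigh) // len(neigh)
    return out

# A tiny 4x4 frame with one bright pixel; blur the 2x2 box around it.
frame = [[0, 0, 0, 0],
         [0, 90, 0, 0],
         [0, 0, 0, 0],
         [0, 0, 0, 0]]
blurred = box_blur_region(frame, 0, 0, 2, 2)
print(blurred[1][1])  # → 10 (the bright pixel is smeared into its neighbours)
```

Because the blur is applied before storage, the original unblurred frames never persist on disk—the key to the privacy claim.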
However, critics pointed out that even with face blurring, the system still captured behavior and actions inside a private domestic environment.
According to computer vision researcher Hany Farid, surveillance technology increasingly enables behavior analysis without identifying faces. “Even anonymized video can reveal patterns of activity that raise privacy concerns,” Farid has noted in research on digital image analysis (Farid, 2016).
In other words, anonymity does not necessarily eliminate surveillance.
The Viral Moment on Social Media
When Tanwar shared the story online, he likely expected a small discussion among technology enthusiasts.
Instead, the post spread rapidly across social media.
Screenshots describing the “AI roommate” circulated widely, triggering thousands of comments and reposts. The story was picked up by tech blogs and online forums where debates about ethics, labor rights, and surveillance erupted.
Supporters praised the creativity of the solution.
Many engineers admired the ingenuity of using AI tools to solve a mundane household problem.
Critics saw something very different.
For them, the incident symbolized a growing culture of digital surveillance directed at workers who often lack legal protections.
One viral comment summarized the backlash:
“Pays ₹4800 per month. Spies on worker with AI. Brags about it online.”
The incident highlighted how technological innovation can clash with social expectations.
Public Reactions: Innovation or Intrusion?
Online responses quickly divided into two camps.
Social Media Reactions
| Viewpoint | Argument |
|---|---|
| Supporters | Property theft justifies monitoring |
| Critics | Surveillance violates worker privacy |
| Technologists | Example of creative DIY AI |
| Labor advocates | Exploitation of informal workers |
Supporters emphasized that theft—regardless of value—represents a breach of trust. For them, Tanwar’s system demonstrated how technology can protect property.
Critics argued the opposite.
They pointed out that domestic workers often operate within informal labor arrangements lacking clear legal protections.
The use of hidden surveillance raised concerns about dignity and consent.
Technology ethicist Shoshana Zuboff has warned that surveillance technologies often expand beyond their original purpose. “Surveillance systems thrive on collecting behavioral data,” she wrote in The Age of Surveillance Capitalism (Zuboff, 2019).
Even in private homes, the dynamics of observation and power remain complex.
Domestic Workers and Power Imbalances
The controversy also revealed deeper social tensions around domestic labor.
India employs millions of domestic workers, many of whom work informally without written contracts or standardized wages.
According to the International Labour Organization, domestic work worldwide often lacks adequate legal protections, leaving workers vulnerable to exploitation (ILO, 2021).
In such contexts, surveillance technologies can amplify existing power imbalances.
Critics argued that monitoring a low-paid worker with AI while earning far more as a technology professional reflects structural inequalities.
The cook reportedly earned approximately ₹4,800 per month, equivalent to roughly $50–$60.
To some observers, firing someone over missing fruit while employing advanced monitoring technology seemed disproportionate.
Yet others countered that trust is fundamental in domestic work relationships.
Once broken, the employment relationship becomes difficult to maintain.
The Future of DIY AI Surveillance
Regardless of the ethical debate, the technical reality remains clear: building AI monitoring systems at home is becoming easier.
Affordable hardware, open-source software, and accessible machine learning models have dramatically lowered the barrier to entry.
Today, hobbyists can build systems capable of:
- Object detection
- Activity recognition
- Behavioral analysis
- Automated alerts
Many developers experiment with these systems for home security, pet monitoring, or automation.
Tanwar himself has suggested expanding the system beyond surveillance.
Planned upgrades reportedly include gas leak detection and environmental monitoring.
Additional features under development include:
- Idle-time tracking
- Fan speed automation
- Safety alerts for kitchen hazards
These improvements highlight how quickly smart home technology is evolving.
Surveillance in the Age of Smart Homes
The story of a kitchen camera reflects a broader technological shift.
Homes increasingly contain networks of sensors, cameras, and AI-powered devices.
Smart speakers, security cameras, and connected appliances already collect vast amounts of data about daily routines.
AI surveillance simply adds another layer of analysis.
Privacy scholar Helen Nissenbaum has argued that technological change often disrupts established social norms around information sharing. “Contextual integrity breaks down when information flows in unexpected ways,” she wrote in research on digital privacy (Nissenbaum, 2010).
In other words, technology may introduce new capabilities before society fully agrees on acceptable uses.
The question raised by the Bengaluru incident is not simply about a missing apple.
It is about how far technology should reach into everyday human relationships.
Ethical Boundaries for Home AI Systems
As AI tools spread into domestic settings, questions about ethical boundaries become increasingly urgent.
Should workers be notified if they are being monitored?
Should certain areas of the home remain off-limits to surveillance?
Should recordings be stored, shared, or deleted automatically?
Legal frameworks for domestic surveillance remain inconsistent across countries.
Some jurisdictions require explicit consent for recording individuals, while others permit monitoring inside private property.
Technology companies developing smart home products increasingly emphasize transparency features such as visible recording indicators.
Yet DIY systems built by individuals often lack such safeguards.
The Bengaluru incident demonstrates how quickly personal technology experiments can raise societal questions.
Takeaways
- A Bengaluru engineer built an AI surveillance system to monitor activity in his kitchen after suspecting fruit theft.
- The homemade system used a Raspberry Pi, computer vision models, and a camera to detect actions like refrigerator access.
- Alerts and weekly reports summarized kitchen activity using language models.
- The system flagged two instances of fruit removal, leading the engineer to dismiss his cook.
- Social media reactions were divided between praise for technical ingenuity and criticism over privacy concerns.
- The incident highlights growing ethical debates around AI surveillance in domestic environments.
Conclusion
Technology often enters society quietly, one small experiment at a time. A camera above a refrigerator might seem like a minor innovation, especially to an engineer accustomed to building digital systems.
Yet that single camera triggered a conversation reaching far beyond one kitchen in Bengaluru.
Artificial intelligence is steadily moving into private spaces once untouched by automated observation. Devices that can recognize objects, interpret behavior, and generate reports are no longer confined to laboratories or corporations. They can be built on a dining table with inexpensive hardware.
That accessibility carries both promise and risk.
On one hand, AI systems can enhance safety, automate routine monitoring, and provide useful insights about household activity. On the other, they can transform relationships built on trust into relationships governed by surveillance.
The controversy surrounding Tanwar’s “AI roommate” illustrates the delicate balance between technological capability and social responsibility.
As artificial intelligence becomes a household tool, society must decide how—and where—to draw the line.
FAQs
What was the Bengaluru AI kitchen surveillance incident?
A software engineer built an AI monitoring system using a camera and computer vision to track kitchen activity and detect suspected food theft.
How did the AI system detect theft?
The system used object detection models to identify actions such as opening the refrigerator and removing items like fruit.
Was the system connected to the cloud?
No. The setup reportedly processed video locally on a Raspberry Pi device to avoid uploading footage to external servers.
Why did the story go viral?
Many people debated whether using AI surveillance on a domestic worker was innovative problem-solving or an invasion of privacy.
Can anyone build a similar system?
Yes. With affordable hardware, open-source AI models, and basic programming skills, individuals can create home surveillance systems for under $150.