Vibe Coding Plane & Satellite Tracking Revolution

Oliver Grant

February 25, 2026

I have covered enough waves of developer tooling to recognize when something shifts from incremental improvement to structural change. What creative technologist Bilawal Sidhu recently built feels like one of those moments. Using Claude 4.6 and Gemini 3.1, he constructed a real-time plane and satellite tracking dashboard that looks like it belongs in a classified intelligence bunker rather than a solo developer’s workspace.

In practical terms, the system pulls live aircraft telemetry, satellite positions, Austin traffic camera feeds, and panoptic object detection into a unified geospatial interface. It integrates ADS-B flight data, satellite orbital parameters, interactive 3D rendering, and stylized shaders inspired by electro-optical and FLIR military displays. It was built not through meticulous line-by-line coding, but through what Sidhu calls vibe-coding: guiding multiple AI agents in parallel with voice notes, screenshots, and natural language prompts.

The result resembles an aerospace command center. The process behind it signals something larger. Solo developers are beginning to orchestrate fleets of AI models the way startups once assembled engineering teams. The frontier is no longer just what models can do. It is how humans direct them.

The Dashboard That Looks Classified

Sidhu’s system aggregates disparate data streams into a unified geospatial interface. Live aircraft tracking draws from APIs such as AviationStack and AirLabs, which provide flight status and ADS-B data. Satellite telemetry integrates orbital data similar to what Aviation Edge and N2YO make publicly accessible.

What distinguishes the project is not the data sources themselves. Flight and satellite tracking APIs have existed for years. What changes the equation is the speed and fluidity of assembly.

The interface overlays real-time aircraft positions, NORAD satellite IDs, and camera feeds onto a dynamic globe. Interactive 3D visualizations allow zooming between orbital paths and street-level traffic. Shaders simulate CRT scan lines and infrared overlays, mimicking intelligence dashboards seen in films.
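In the dashboard these retro effects live in GPU fragment shaders, but the per-pixel arithmetic is simple enough to sketch in plain Python. The following is a minimal, illustrative version of a CRT scan-line overlay (function name and parameters are my own, not from Sidhu's build): every other row-band of a grayscale frame is dimmed by a sinusoidal modulation.

```python
import math

def apply_scanlines(frame, strength=0.4, period=2):
    """Dim alternating row-bands to mimic a CRT scan-line overlay.

    `frame` is a list of rows of 0..1 grayscale values. In a real
    dashboard this math would run per fragment on the GPU, but the
    arithmetic is identical.
    """
    out = []
    for y, row in enumerate(frame):
        # Sinusoidal modulation: full brightness between scan lines,
        # dimmed by `strength` on the lines themselves.
        mod = 1.0 - strength * (0.5 + 0.5 * math.cos(2 * math.pi * y / period))
        out.append([min(1.0, px * mod) for px in row])
    return out

frame = [[0.8] * 4 for _ in range(4)]   # uniform gray test frame
shaded = apply_scanlines(frame)
print([round(row[0], 2) for row in shaded])  # → [0.48, 0.8, 0.48, 0.8]
```

The same modulation, expressed against the fragment's y-coordinate, is a common one-liner in GLSL shaders for this aesthetic.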

Sidhu reportedly ran up to eight AI agents simultaneously. Some handled frontend shaders. Others wired geospatial APIs. Still others built telemetry ingestion pipelines. Instead of writing each module by hand, he orchestrated model collaboration.

What Is Vibe-Coding?

Vibe-coding represents a shift from syntax-driven programming to intent-driven orchestration. Rather than specifying precise implementation details, developers provide high-level goals, aesthetic direction, and functional constraints. AI systems generate and iterate on the underlying code.

The approach builds on trends accelerated by tools like GitHub Copilot and large language models capable of code synthesis. In 2021, GitHub reported that nearly 40 percent of newly written code in files where Copilot was enabled was machine-generated (GitHub, 2021).

Vibe-coding goes further. It combines:

  • Natural language prompts
  • Voice memos describing desired behavior
  • Screenshot annotations
  • Parallel AI agents assigned role-specific tasks

Computer scientist Andrej Karpathy described similar patterns as “programming in English,” where prompts replace low-level syntax (Karpathy, 2023). The emphasis shifts from writing code to directing intelligence.

Sidhu’s dashboard demonstrates how that philosophy scales in practice.

Multi-Agent Systems as Solo Force Multipliers

Running eight agents simultaneously transforms development into orchestration. In multi-agent AI architectures, a coordinating agent decomposes a project into subtasks and assigns them to specialized instances.
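The fan-out pattern is easy to sketch. The snippet below is a hypothetical skeleton, not Sidhu's actual tooling: each "agent" is a stub async function standing in for a real model API call, and a coordinator dispatches role-specific subtasks in parallel and collects the results.

```python
import asyncio

# Hypothetical stand-in for a real model API call: each "agent" receives
# a role-specific prompt and returns generated output as text.
async def run_agent(role: str, prompt: str) -> str:
    await asyncio.sleep(0.01)  # placeholder for network/model latency
    return f"[{role}] completed: {prompt}"

async def orchestrate(subtasks: dict[str, str]) -> dict[str, str]:
    """Fan a decomposed project out to role-specific agents in parallel."""
    # gather() preserves argument order, so results align with subtasks
    results = await asyncio.gather(
        *(run_agent(role, prompt) for role, prompt in subtasks.items())
    )
    return dict(zip(subtasks, results))

plan = {
    "telemetry": "stream ADS-B positions",
    "geospatial": "wire map layers to flight API",
    "shaders": "add CRT scan-line effect",
}
results = asyncio.run(orchestrate(plan))
print(results["shaders"])  # → [shaders] completed: add CRT scan-line effect
```

Swapping the stub for calls to actual model endpoints turns this into the kind of coordinator-worker structure described above.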

Research into agent collaboration has accelerated since large models began demonstrating tool-use capabilities. Google DeepMind’s work on agentic reasoning shows how models can plan, execute, and revise complex tasks (DeepMind, 2023).

Sidhu reportedly used Gemini 3.1 for heavy computational and visualization tasks, including live telemetry streaming and interactive 3D rendering. Claude 4.6 supported geospatial wiring and shader logic.

The structure resembles a software team:

| Agent Role | Model Used | Task Focus |
| --- | --- | --- |
| Commander | Gemini 3.1 Pro | Task decomposition and orchestration |
| Telemetry Engineer | Gemini 3.1 | Live flight/satellite data streams |
| Geospatial Integrator | Claude 4.6 | Map layers, API connections |
| Shader Specialist | Claude 4.6 | EO/FLIR/CRT visual effects |
| Testing & Debug Agent | Gemini 3.1 | Error detection and iteration |

The effect is leverage. One developer approximates a multidisciplinary team.

The Models Behind the Build

Claude is developed by Anthropic and emphasizes safety and code synthesis. Gemini is developed by Google DeepMind and has demonstrated strong performance in reasoning and multimodal tasks.

Google introduced Gemini in December 2023, positioning it as a multimodal model capable of handling text, images, audio, and video (Google, 2023). Anthropic’s Claude models have gained adoption for structured reasoning and extended context windows.

In benchmark comparisons, frontier models increasingly perform competitively on coding tasks. OpenAI’s GPT-4 demonstrated strong coding performance in its 2023 technical report (OpenAI, 2023). Gemini and Claude have similarly emphasized developer use cases.

Sidhu’s project leverages both strengths. Gemini handles dynamic visual rendering and complex data flows. Claude synthesizes structured geospatial logic. The combination mirrors ensemble systems common in machine learning research.

APIs Powering Real-Time Tracking

The dashboard depends on accessible data infrastructure. AviationStack and AirLabs provide RESTful JSON endpoints for live flight data. Satellite APIs such as Aviation Edge and N2YO supply Two-Line Element sets, enabling orbital position calculation.

These services often offer free tiers with rate limits suitable for prototypes. ADS-B data, originally designed for aviation safety, has become a foundation for public flight tracking.
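To see what a Two-Line Element set actually yields, here is a minimal sketch that derives an orbit's period and approximate altitude from the mean-motion field of a TLE, using only Kepler's third law. The TLE line is a widely circulated December 2019 ISS element set used for illustration; the field-splitting logic assumes standard TLE column conventions.

```python
import math

# Sample ISS TLE, line 2 (illustrative; satellite tracking APIs such as
# N2YO serve current element sets in this same format).
TLE_LINE2 = "2 25544  51.6439 211.2001 0007417  17.6667  85.6398 15.50103472202482"

MU_EARTH = 398600.4418  # km^3/s^2, Earth's gravitational parameter
R_EARTH = 6378.137      # km, Earth's equatorial radius

def tle_orbit_summary(line2: str) -> dict:
    """Derive period and approximate altitude from a TLE's mean motion."""
    fields = line2.split()
    inclination_deg = float(fields[2])
    # Mean motion (revolutions/day) shares its column block with the
    # revolution counter, so keep only the first 11 characters.
    mean_motion = float(fields[7][:11])
    period_s = 86400.0 / mean_motion
    # Kepler's third law: a = (mu * (T / 2*pi)^2)^(1/3)
    semi_major_km = (MU_EARTH * (period_s / (2 * math.pi)) ** 2) ** (1 / 3)
    return {
        "inclination_deg": inclination_deg,
        "period_min": period_s / 60.0,
        "altitude_km": semi_major_km - R_EARTH,
    }

summary = tle_orbit_summary(TLE_LINE2)
print(summary)  # ISS: ~51.6 deg inclination, ~93 min period, ~415 km altitude
```

Full position prediction uses the SGP4 propagator rather than this two-body shortcut, but the sketch shows why TLEs alone are enough to plot orbital paths on a globe.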

A comparison of typical API features illustrates the ecosystem:

| API Service | Data Type | Free Tier | Common Use Case |
| --- | --- | --- | --- |
| AviationStack | Flight status | Yes | Real-time aircraft tracking |
| AirLabs | Global flights | Yes | Live aviation dashboards |
| Aviation Edge | Satellites | Limited | NORAD-based satellite maps |
| N2YO | Orbital elements | Yes | ISS and satellite tracking |

Integrating these feeds traditionally required backend engineering expertise. AI-assisted synthesis lowers that barrier.
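The glue code itself is modest. The sketch below parses a flight-tracking JSON payload and keeps only aircraft currently reporting live positions; the field names mirror common flight-API responses but are assumptions for illustration, not AviationStack's actual schema.

```python
import json

# Illustrative payload shaped like a typical flight-tracking API response
# (field names are assumed for this sketch, not a real provider's schema).
SAMPLE_RESPONSE = json.dumps({
    "data": [
        {"flight": {"iata": "AA123"},
         "live": {"latitude": 30.19, "longitude": -97.67, "altitude": 11277.0}},
        {"flight": {"iata": "UA456"}, "live": None},  # no live telemetry yet
    ]
})

def extract_positions(raw: str) -> list[dict]:
    """Keep only flights that currently report live position data."""
    positions = []
    for entry in json.loads(raw)["data"]:
        live = entry.get("live")
        if live is None:
            continue  # scheduled or landed flights carry no live block
        positions.append({
            "flight": entry["flight"]["iata"],
            "lat": live["latitude"],
            "lon": live["longitude"],
            "alt_m": live["altitude"],
        })
    return positions

tracked = extract_positions(SAMPLE_RESPONSE)
print(tracked)  # one live flight: AA123 over the Austin area
```

In a live dashboard, the raw string would come from a rate-limited HTTP poll of the provider's endpoint, and the extracted positions would feed straight into the map layer.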

From ARCore to Aerospace Dashboards

Sidhu’s background contextualizes the experiment. As a former product manager at Google, he worked on Immersive View and ARCore Geospatial APIs. ARCore, introduced in 2018, allowed developers to anchor digital content to physical coordinates (Google Developers, 2018).

That experience with spatial computing informs the dashboard’s geospatial fluency. Real-time tracking becomes not just a data display but a spatial narrative.

Sidhu has also built NeRF-based 3D reconstructions and generative AI video experiments. VentureBeat and PetaPixel have covered his spatial AI explorations. His audience of more than a million followers consumes tutorials blending AI, VFX, and spatial computing.

The tracking dashboard fits within that trajectory. It combines aerospace telemetry with cinematic interface design, merging technical precision and aesthetic storytelling.

Expert Perspectives on AI-Orchestrated Development

Dr. Fei-Fei Li of Stanford has long emphasized the importance of human-centered AI design. “AI will augment human capability, not replace it,” she said in a 2018 Stanford HAI talk (Li, 2018).

Sidhu’s workflow illustrates augmentation. The models generate code, but he directs architecture, aesthetic tone, and system integration.

Wharton professor Ethan Mollick has written that generative AI enables “co-intelligence,” where humans and models collaborate iteratively (Mollick, 2023). Multi-agent orchestration operationalizes that concept.

Meanwhile, software engineering researcher Mary Shaw of Carnegie Mellon has noted that abstraction layers historically drive productivity leaps. “The history of software engineering is a history of raising the level of abstraction,” she observed (Shaw, 2010).

Vibe-coding may represent the next abstraction layer: intent as interface.

The Cultural Aesthetic of Spy-Tech

The dashboard’s design intentionally mimics classified systems. CRT scan lines, infrared overlays, and telemetry grids evoke defense imagery. Popular culture has long glamorized aerospace intelligence rooms.

The aesthetic resonates partly because flight and satellite tracking are publicly accessible yet feel strategically sensitive. Platforms like FlightRadar24 have demonstrated public appetite for live aviation maps.

By blending open APIs with cinematic rendering, Sidhu reframes public data as immersive narrative.

The approach raises questions. When consumer tools replicate military-style dashboards, does it democratize situational awareness or blur lines between hobbyist experimentation and surveillance aesthetics?

For now, it remains a creative demonstration. But it signals how easily advanced interfaces can be assembled.

Timeline of the Build

Though built rapidly, the project follows a structured arc:

| Phase | Activity |
| --- | --- |
| Conceptualization | Voice-note brainstorming and UI sketches |
| Agent Deployment | Launch of parallel AI agents |
| API Integration | Wiring flight and satellite data streams |
| Visualization | Building 3D globe and shader layers |
| Iterative Debugging | Agent-based testing and refinement |
| Final Demo | Public video showcase |

Such compression of workflow would have required weeks or months in traditional development cycles.

Takeaways

  • Vibe-coding shifts programming from syntax execution to intent orchestration.
  • Multi-agent AI workflows allow solo developers to approximate full engineering teams.
  • Public flight and satellite APIs enable real-time aerospace dashboards.
  • Gemini 3.1 and Claude 4.6 complement each other in visualization and structured coding.
  • The project reflects broader trends in abstraction and AI-augmented software creation.
  • Spatial computing expertise enhances data storytelling and immersion.

Conclusion

I view this project less as a novelty and more as a signal. Software development has always advanced through abstraction: assembly to high-level languages, monoliths to frameworks, cloud to serverless. Now we move toward orchestration of machine collaborators.

Bilawal Sidhu’s real-time tracking dashboard embodies that transition. He did not eliminate coding. He reframed it. The locus of expertise shifted upward from syntax to systems thinking.

AI models did not independently invent the dashboard. They responded to guided intention. The creative direction, aesthetic cohesion, and architectural vision remained human.

As models grow more capable, the differentiator may not be who writes the most code but who directs intelligence most effectively. In that future, vibe-coding is not a shortcut. It is a skill.

FAQs

What is vibe-coding?
It is an AI-driven development style where natural language prompts guide models to generate and iterate on code.

Which models were used in the project?
Claude 4.6 handled geospatial logic and shaders, while Gemini 3.1 supported telemetry and 3D visualization.

What data powers real-time flight tracking?
Most systems rely on ADS-B aviation signals accessed through APIs like AviationStack or AirLabs.

Are satellite positions publicly available?
Yes. Services like Aviation Edge and N2YO provide orbital data through public APIs.

Does this replace traditional programming?
No. It augments development by shifting focus toward orchestration and high-level design.
