2018 GSoC Ideas Page



What kinds of skills or experience should you bring?

  • Programming in JavaScript and/or systems programming in C/C++, Python or other languages.
  • Linux system development, especially input devices and/or X.Org
  • Experience with Google Earth environment, including KML or other geospatial formats.
  • Technical experience with other panoramic display or panoramic content systems.

What resources are available to help you?

Although we expect students will have enough hardware themselves (or through their school) to develop their project, in some circumstances we may be able to assist with loans of equipment. We can certainly arrange visits to Liquid Galaxy systems if one is in your area.

Enabling applications for Liquid Galaxy

Getting other applications to work on Liquid Galaxy is always valuable to the Galaxy community. A good example of an application on our list is the open-source virtual globe Cesium. If you have experience coding with Cesium, or are willing to learn, please talk to us.

Skills: developer experience with an app or engine that could benefit from working on Liquid Galaxy.

Enabling WebVR for Liquid Galaxy

VR in the browser is an exciting area of development for immersive visualisation. We can run a browser on each screen of a Galaxy cluster, so a WebVR 'VRDisplay' that describes the Galaxy display cluster would be an awesome addition to our software environment.

Skills: JavaScript and APIs, experience building WebVR applications.
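A cluster-wide 'VRDisplay' would need to give each screen its own slice of the panoramic view. A minimal sketch of that geometry, assuming displays arranged in a horizontal arc with the screen count, field of view, and function name all illustrative rather than an existing LG API:

```javascript
// Sketch: compute the yaw offset each display should apply to the
// shared camera pose, for screens arranged left-to-right in an arc.
// `fovDeg` is the horizontal field of view covered by one display.
function yawOffsetForScreen(screenIndex, totalScreens, fovDeg) {
  // Offset of this screen's centre from the cluster's central view axis.
  const centre = (totalScreens - 1) / 2;
  return (screenIndex - centre) * fovDeg;
}
```

For a five-screen rig with 36° per display, the leftmost screen renders at -72° from the master heading and the centre screen at 0°; a cluster 'VRDisplay' would fold this offset into the view matrix it hands each browser.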

Improving Navigation & Control

The primary controller on Liquid Galaxy installations is the 3Dconnexion SpaceNavigator, which is a fantastic controller but does take some getting used to. We are especially keen to see SpaceNavigator control implemented with the HTML5 Gamepad API and web-based 3D apps like Cesium or WebGL demos.

Skills: JavaScript, trigonometry.
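The core of a Gamepad-API SpaceNavigator driver is mapping raw axis readings to camera motion. A minimal sketch, in which the deadzone, scaling constants, and response curve are illustrative assumptions:

```javascript
// Sketch: map one SpaceNavigator axis reading (as exposed through the
// HTML5 Gamepad API via navigator.getGamepads()) to a camera velocity.
const DEADZONE = 0.08;   // ignore tiny readings near the centred position
const MAX_SPEED = 5.0;   // degrees per frame at full deflection

function axisToVelocity(axisValue) {
  const v = Math.max(-1, Math.min(1, axisValue));   // clamp to [-1, 1]
  if (Math.abs(v) < DEADZONE) return 0;
  // Rescale so motion ramps smoothly from zero at the deadzone edge,
  // with a squared curve for fine control near the centre.
  const sign = v < 0 ? -1 : 1;
  const t = (Math.abs(v) - DEADZONE) / (1 - DEADZONE);
  return sign * t * t * MAX_SPEED;
}
```

In a browser, a requestAnimationFrame loop would read `navigator.getGamepads()[0].axes` each frame and feed the translation and rotation axes through a mapping like this into the Cesium or WebGL camera.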

Improved handling of panoramic content

Liquid Galaxy is a fantastic platform for viewing panoramic image and video content. Our current tools for viewing panoramic content can always benefit from new ideas and features. We are looking for students willing to work on software such as Peruse-a-Rue and Pannellum.

Skills: a good understanding of panoramic imagery and projections, programming experience in whatever tool you think will get the job done!

Bringing other applications and system capabilities to Liquid Galaxy

Put on your thinking cap and come up with a novel application for immersive visualisation on Liquid Galaxy! Examples:
  • Multi-screen, multi-user Google Hangouts. We'd love to see immersive multi-screen video conferencing running on Liquid Galaxy using Google Hangouts or WebRTC.
  • Interactive games using the Google Earth or Street View environment: geography quizzes, hide-and-seek games such as GeoGuessr.
  • VJ'ing software development or integration, for which Liquid Galaxy would be a most impressive platform.
  • Automated setup and tools for quickly calibrating offsets for the displays on the system.
Various skills required: JavaScript, Hangouts API, HTML5, WebRTC.

New Control Devices

Most Liquid Galaxies are controlled by a SpaceNavigator and a touchscreen. We'd like to see other input devices control Google Earth. Obvious examples include the Wii Remote, Microsoft Kinect, and Android phones/tablets (using accelerometers or GPS), as well as head and eye trackers or other novel input devices.

Panoramic Content Production

Because Liquid Galaxy uses commodity hardware, we're generally limited to flat display technologies. This means we're displaying multiple flat planar views arranged in a cylinder (rectilinear). However, most panoramic content publishers use spherical or cylindrical (curvilinear) projections. Displaying this type of content on a Liquid Galaxy requires conversion to multiple planar views. We'd like to see this process automated as much as possible.
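The heart of that conversion is the projection math: for each pixel of a flat screen, find the latitude/longitude it samples in the spherical panorama. A minimal sketch, assuming normalised screen coordinates and an equirectangular source (parameter names are illustrative):

```javascript
// Sketch: map a pixel on one flat (rectilinear) screen back to the
// lat/lon it samples in an equirectangular panorama.
// x, y are normalised screen coordinates in [-1, 1]; hFovDeg/vFovDeg
// are the screen's field of view; yawDeg rotates the screen around
// the panorama (each LG display gets a different yaw).
function planarToSpherical(x, y, hFovDeg, vFovDeg, yawDeg) {
  const rad = Math.PI / 180;
  // Half-extent of the virtual screen plane placed at distance 1.
  const w = Math.tan((hFovDeg / 2) * rad);
  const h = Math.tan((vFovDeg / 2) * rad);
  const lon = Math.atan2(x * w, 1) / rad + yawDeg;
  const lat = Math.atan2(y * h, Math.hypot(x * w, 1)) / rad;
  return { lon, lat };
}
```

A batch converter would run this per output pixel and sample the source image at the resulting lat/lon (with interpolation); the same function, with a different yaw per display, generates each screen of the cylinder.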

Another way of producing 360-degree panoramas and/or point clouds is structure from motion. Possible projects in this field include:
  • Dockerize OpenSfM or Bundler
  • Create a simple web GUI for managing image sequences and results
  • Add a point-cloud viewer for CesiumJS, or use any other web viewer for point clouds (optional)

System, Network and Caching Performance Monitoring

We'd like better insight into each level of the Liquid Galaxy stack, especially the multiple levels of caching. A detailed near-real-time performance monitoring solution could help diagnose bottlenecks and configure the Squid HTTP cache for better performance. Metrics from each system should be collected and displayed immediately, including disk usage, networking, CPU and GPU utilization, HTTP cache hits and misses, etc. What is an optimal cache size (or content age) for the Google content?
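One small building block for such a dashboard is rolling per-interval cache counters up into a hit ratio; a minimal sketch, where the sample shape is an illustrative assumption rather than Squid's actual counter format:

```javascript
// Sketch: aggregate per-interval { hits, misses } samples from the
// Squid cache into an overall hit ratio for display on the dashboard.
function hitRatio(samples) {
  const totals = samples.reduce(
    (acc, s) => ({ hits: acc.hits + s.hits, misses: acc.misses + s.misses }),
    { hits: 0, misses: 0 }
  );
  const requests = totals.hits + totals.misses;
  // Avoid division by zero when no traffic was seen in the window.
  return requests === 0 ? 0 : totals.hits / requests;
}
```

Plotting this ratio over time while varying the cache size (or maximum content age) would directly answer the tuning question above.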

Add or adapt Networked View Synchronisation to other Applications


Google Earth is certainly a "killer app" for the Liquid Galaxy platform, but there are many applications that could easily be enhanced to coordinate multiple instances, each rendering a portion of a panoramic view. Here are a few ideas:
  • MPlayer has already been patched to enable coordinated playback for PanoramicVideo. However, the existing patch works only for video, not audio. Modify MPlayer to also send UDP master packets even when no video frames are available to be rendered, i.e. during audio-file playback. Alternatively, modify VLC to account for bezels when using the "wall" filter for immersive ultra-widescreen movie display, e.g. Cinerama epics!
  • Document setups for commercial simulation apps that support multi-machine clusters, such as FSX and X-Plane; surely there are some racing sims too?
  • Enhance WebGL samples to work on a Galaxy setup, similar to how Gregg enhanced the WebGL Aquarium. An example WebGL candidate would be something like the Google Body browser. Can we develop a JavaScript support library that assists developers working in WebGL (e.g. using Three.js or SceneJS) to make their applications immersive? Convert eye-candy demos like Rome to Liquid Galaxy; see Chrome Experiments. Additional WebGL apps of worth: OpenWebGlobe, Nokia Maps.
  • A multi-screen YouTube launcher. Synchronise playlists across all the displays. Perhaps, when doing a video search, show one video on each screen. A Chrome/Firefox extension to open windows/tabs on specific LG screens may be useful here.
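The MPlayer idea above hinges on master packets that carry only a playback position, so they can be emitted whether or not video frames exist. A minimal sketch of building and parsing such a datagram, where the exact wire format (plain-text seconds) is an assumption for illustration:

```javascript
// Sketch: build and parse a master playback-position datagram for
// MPlayer-style UDP sync. Emitting these from the audio clock as well
// as the video clock is what would make audio-only sync work.
function buildPositionPacket(seconds) {
  return Buffer.from(seconds.toFixed(6) + "\n", "ascii");
}

function parsePositionPacket(buf) {
  const pos = parseFloat(buf.toString("ascii"));
  if (Number.isNaN(pos)) throw new Error("malformed position packet");
  return pos;
}
```

A slave receiving these packets (e.g. on a Node dgram socket) would compare the master position with its own clock and seek only when the drift exceeds some threshold, to avoid constant stuttering corrections.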

Touch Screen Control Enhancements

  • Application (Earth/Mplayer/Sauerbraten) “selection” buttons using xdotool to do window searches and map/unmap or similar.
  • Tour Control (not likely feasible).
  • Load/Unload KML from touchscreen.
  • Control standard Google Earth features, e.g. toggle layers and grid, Sun mode, etc.

Google Earth Networking & Collaboration Enhancements

  • A contributed script called "viewsyncrelay" can act as the recipient for the Google Earth ViewSync packets sent by the Google Earth master. The script broadcasts the packets to the slave nodes in a Galaxy setup. As a middle-man, the script can potentially alter the values, execute scripts on the clients, collect statistics, trigger sound effects, etc. Help this script grow into a more functional and extensible tool. Must know Perl (or similar) and be familiar with UDP network communication.
  • Connect several LG rigs together for shared virtual tours and Google Earth-based field trips. Can probably be achieved by adapting viewsyncrelay.pl and some ViewSync->KML->ViewSync glue.
  • Immersive multi-screen Google Plus 'Hangout' video conference with friends on a Liquid Galaxy! See NTT t-Room as an example. EVO or AccessGrid may be options. WebRTC and/or Hangout API may also help here.
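The relay and rig-linking ideas above both revolve around rewriting ViewSync datagrams in flight. A minimal sketch of that core step in JavaScript (rather than the existing Perl), where the comma-separated field order shown is an assumption to illustrate the idea:

```javascript
// Sketch: parse a Google Earth ViewSync datagram (assumed here to be
// comma-separated: counter, lat, lon, altitude, heading, tilt, roll),
// so a relay can inspect or alter values before re-sending to slaves.
function parseViewSync(text) {
  const f = text.trim().split(",").map(Number);
  return { counter: f[0], lat: f[1], lon: f[2], alt: f[3],
           heading: f[4], tilt: f[5], roll: f[6] };
}

function serialiseViewSync(v) {
  return [v.counter, v.lat, v.lon, v.alt, v.heading, v.tilt, v.roll].join(",");
}
```

A Node relay would receive these on a dgram socket, apply per-slave tweaks (e.g. a heading offset for each screen, or a trigger when the view enters a region), and forward the re-serialised packet, which is also the natural hook for linking several rigs together.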

GigaPan Viewer, on Adobe Air

  • Add UDP broadcast/receive to GigaPan Desktop Viewer
  • Update GigaPan Desktop to AIR 2.5 namespace so that it can be made into an Android .apk
  • Extend the AIR usage further to allow Android-controlled (or not) gaming on a Galaxy setup.

Liquid Galaxy System Deployment Automation

Presently, Liquid Galaxy systems are complex to deploy, requiring several hours of one or more experienced Linux system administrators' time. This wiki does have installation instructions, but the process could benefit from better automation, testing and documentation. Features like GUI configuration, or automated (even dynamic) personality assignment, would put Liquid Galaxy much nearer the reach of enthusiasts.

Bring the desktop Google Earth user-interface experience to Liquid Galaxy

Liquid Galaxy setups are fantastic platforms for showcasing Google layers and datasets. However, it is difficult for users to load their own KML and datasets, or to interact with some features of Google Earth, e.g. turning specific layers on and off. If LG were as easy to interact with as desktop Google Earth, a whole community of education and scientific users would thank you! There are open-source tools that may help here, e.g. Synergy, Sikuli, and Input Director (Windows only).

Random Ideas...

  • HowTos for bringing up Liquid Galaxy on non-Linux platforms.
  • Trigger location-based sound effects using a database of geo-located audio (for example, the British Library UK Sound Map, Global Soundscape Network, or UrbanRemix): a water splash when diving into the ocean, the sound of the surf when near the coast, etc. Is Pumilio (an open-source soundscape manager) able to export KML that is useful in Google Earth?
  • Investigate the potential of Mumble, TeamSpeak or Ventrilo for surround/3D positional audio in a rig, as well as for inter-rig voice communication.
  • Construct a 3D model for calibration of LG rigs. Basically a cylinder with a test pattern, angle of orientation written around it.
  • Investigate getting the SpaceNavigator working in JavaScript. A possible path is the 3Dx v10 drivers plus JavaScript joystick emulation.
  • Link with and develop a HowTo Guide for geography teachers & school computer clubs.
  • Further develop real-time in-world Google Earth avatars, currently prototyped using ViewSync->KML. Would be cool for classrooms.
  • Need some way of benchmarking Liquid Galaxy rig overall performance.
  • Investigate BoyGrouping for clustered displays when using the vvvv real-time vis toolkit.
  • Check multi-machine rendering with Most Pixels Ever and Polycode.
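The geo-located audio idea above needs a distance test between the current camera position and each sound's location. A minimal great-circle (haversine) sketch, assuming a spherical Earth:

```javascript
// Sketch: great-circle (haversine) distance in metres between the
// current camera position and a geo-located sound, to decide whether
// the sound should be triggered.
const EARTH_RADIUS_M = 6371000; // mean Earth radius, metres

function haversineMetres(lat1, lon1, lat2, lon2) {
  const rad = Math.PI / 180;
  const dLat = (lat2 - lat1) * rad;
  const dLon = (lon2 - lon1) * rad;
  const a = Math.sin(dLat / 2) ** 2 +
            Math.cos(lat1 * rad) * Math.cos(lat2 * rad) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
}
```

A trigger loop watching the ViewSync position stream could play any sound whose haversine distance to the camera falls below a per-sound radius, perhaps with volume falling off as distance grows.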



(Contents from the main Liquid Galaxy GitHub repository)


Andreu Ibáñez

Andreu Ibáñez is the coordinator of the ICT Labs at the Parque Científico Lleida: Liquid Galaxy, LleidaDrone LAB and Artificial Intelligence LAB. http://www.andreuibanez.com