Keogramist: Explore 1400 Nights of Aurora on a Phone
AGU Fall Session 2020
Author contact: email@example.com
The aurora research community has made full-colour all-sky camera (ASC) imagery available online for public outreach and education, providing a deep resource of archival material. Concurrently, modern mobile devices such as smartphones have improved in capability and can now present interactive video and 3D content directly in web browsers. This work represents an example of citizen science in which an individual from another field repurposed, or "MacGyvered", the existing AuroraMAX ASC timelapse archive to make over 1400 nights of aurora activity more engaging and immersive for users of mobile devices. The resulting "Keogramist" web app is designed to be a visually intuitive gateway for exploring and experiencing auroral activity. An "enhanced stacked keogram" interface organizes aurora activity temporally and enhances visible features using minor image processing. Users can choose any night's keogram, then replay and interact with the corresponding timelapse ASC video from a first-person perspective, emulating what a viewer would experience standing under the aurora and looking at the sky around them. Adjustable parameters accommodate a range of existing ASC resolutions, fields of view, and replay rates, allowing other ASC archives to be explored in the same way. Scientific and public users alike may find the format beneficial, as it provides quick, mobile review of events of interest. This platform may enable citizen scientists to interact with large amounts of data for image and feature characterization.
Motivation and Background
A 'Human' viewpoint of aurora video
- Help as many people as possible explore the existing aurora borealis all-sky camera (ASC) timelapse archives using common platforms and low-barrier technology
- Make a more intuitive visual search interface to rapidly locate nights with activity of interest and filter out nights with poor viewing conditions
- Create an immersive, first-person viewpoint, as if the user were standing in the same place as the ASC
- Allow the user to look around this virtual environment, access specific moments of the night's activity, and share these moments with others using a standard URL
An 'accessible' visual index of night sky activity: keograms
Implementation and performance
Keogram & metadata generation
Offline extraction of timecodes and keogram images directly from existing MP4 video was successful
- Keograms generated from compressed MP4 video (the format publicly available from the AuroraMAX archive) were still of high enough quality for visual inspection and for differentiating substorms, weather, and red, green, and magenta aurora colours
- Extracting timestamp data from these MP4 files using an optical character recognition library (PyTesseract) was found to be reliable if the algorithm discarded corrupt results and simply tried adjacent video frames
- Using a series of command-line-driven Python scripts kept the process flexible enough to run on a schedule and under automation
- Storing all metadata in a single JSON file compressed well for delivery over the web, and allowed the metadata schema to be extended organically
- Client devices were found to have more than enough memory for local parsing and search of this data, even for 1400 night entries, each with 8 or more variables (keys)
- The keogram interface loaded efficiently over mobile data, and standard browser caching of images made revisits quick to load
- Progressive loading of keogram image files kept the interface responsive, breaking approximately 35 MB of keogram resources into 20-30 KB chunks
- Time-aligning 3-hour Kp as a colour-coded heat-map appeared intuitive and visually highlighted nights of higher geomagnetic activity
- Scaling and aligning by time and local midnight allowed comparison of activity around midnight, and differentiation between longer and shorter nights (and videos)
- Vertically stacked, horizontally aligned keograms were suitable for all screen sizes and platforms tested (since vertical scrolling is a familiar web interaction for progressive reveal of information), in contrast to other arrangements such as horizontal stacking
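The keogram-generation step above can be sketched in a few lines: each keogram column is the vertical centre column of one video frame, stacked left to right over the night, with a per-night metadata entry written out as JSON. This is a minimal, self-contained sketch, assuming frames have already been decoded from the MP4 (e.g. with OpenCV's VideoCapture, not shown); the function names and the metadata schema here are hypothetical, not the actual Keogramist pipeline.

```python
# Sketch of offline keogram generation from decoded video frames.
# Frames are represented as nested lists (rows of pixels); in practice
# they would be numpy arrays from a video decoder.
import json

def centre_column(frame):
    """Return the vertical centre column of one frame."""
    mid = len(frame[0]) // 2
    return [row[mid] for row in frame]

def build_keogram(frames):
    """Stack one centre column per frame, left to right.

    keogram[y][t] is the pixel at elevation row y for video frame t."""
    columns = [centre_column(f) for f in frames]
    height = len(columns[0])
    return [[col[y] for col in columns] for y in range(height)]

def night_metadata(night_id, frames, start_ts, cadence_s):
    """One JSON-serialisable metadata entry per night (hypothetical schema).

    start_ts would come from OCR of the frame's timestamp overlay, retrying
    adjacent frames when a read is corrupt."""
    return {
        "night": night_id,
        "frames": len(frames),
        "start": start_ts,
        "cadence_s": cadence_s,
        "keogram_width": len(frames),
    }

# Tiny synthetic example: 4 frames of a 3-row by 5-column greyscale image,
# where pixel value encodes frame index and row.
frames = [[[t * 10 + y for _x in range(5)] for y in range(3)] for t in range(4)]
keogram = build_keogram(frames)
entry = night_metadata("20200101", frames, "2020-01-01T17:00:00Z", 6)
print(json.dumps(entry))
```

Because each night reduces to one narrow image plus one small JSON entry, 1400 nights of metadata stay easily within a phone's memory for client-side search.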
Hardware-accelerated 3D rendering in browsers was suitable for real time re-projection of ASC video
- The app achieves between 30 and 60 frames per second performance on mobile and desktop computers (this is independent of the frame rate of the ASC video playback)
- Works consistently on current major browsers, thanks to web standards compliance for hardware-accelerated WebGL 3D
- Buffering, playback-rate, and seeking behaviours differed slightly across browsers, but accounting for these differences yielded a consistent interface
- Dynamically selecting the video file based on screen resolution improved loading performance without compromising perceived quality (e.g. 1080p for large screens and 480p for smaller mobile screens)
- Projecting the video as a texture on a geometry (i.e. a hemisphere) provided the desired effect of a 'planetarium' or 'skydome' view
- Warping the skydome mesh's vertices was an effective way of compensating for various non-ideal ASC lens distortion effects, especially near the extreme horizon
- Most ASC lens projection transforms tested conformed approximately to the 'equidistant' model, i.e. r = f*theta, where the pixel distance r from the centre of the camera sensor (and therefore of the video frame) is linearly related, via the focal scale f, to the angle theta from the zenith
- The extreme edges (horizon line) of the ASC video capture tended to be occluded by local buildings, foliage, and even light pollution or protective dome distortion, so a virtual background forest was added to obscure these artifacts and maintain the immersive illusion
- Adding information such as compass and time data as unobtrusive elements also helped maintain the immersive illusion
- Although most ASC timelapse is at a 4-10 second cadence, testing with 24 frame-per-second real-time video was also performant, and provided additional 'realism' due to smoother, real-time motion
- Although 1080p is a standard 'high' resolution video, 4K resolution would be more suited to skydome applications (i.e. 4 times the pixel count)
- Rather than re-encode existing timelapse video to fit the AuroraDome viewer, it proved more suitable to have the viewer do all the work in real time, since WebGL is optimized to map video textures onto the geometry of the virtual skydome using hardware graphics acceleration and the dome's vertex locations
- Building the AuroraDome viewer as a separate web app that receives its parameters from the URL makes sharing a specific moment of a specific night (along with camera and replay parameters) much easier, and independent of any separate indexing system. This makes AuroraDome applicable to any ASC video data of appropriate format.
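The equidistant model above determines where each skydome vertex samples the flat all-sky video: a direction in the sky (angle from zenith, azimuth) maps to a point in the circular fisheye image. In the real viewer this mapping runs on the GPU as WebGL texture coordinates; the following is a minimal Python sketch of the same math, with illustrative names and a default focal scale chosen so that the horizon lands exactly on the edge of a unit-radius image circle.

```python
# Minimal sketch of the equidistant ('r = f*theta') fisheye mapping used to
# project all-sky video onto a virtual skydome.
import math

def sky_to_uv(theta, phi, f=2.0 / math.pi):
    """Map a sky direction to normalised video texture coordinates in [0, 1].

    theta: angle from zenith in radians (0 = straight up, pi/2 = horizon)
    phi:   azimuth in radians, measured from 'up' in the video frame
    f:     focal scale; the default puts the horizon at the image-circle edge
    """
    r = f * theta                       # equidistant model: r grows linearly
    u = 0.5 + 0.5 * r * math.sin(phi)   # image centre is at (0.5, 0.5)
    v = 0.5 + 0.5 * r * math.cos(phi)
    return u, v

# The zenith maps to the centre of the frame,
# and the horizon at phi = 0 maps to (near) the top edge.
print(sky_to_uv(0.0, 0.0))
print(sky_to_uv(math.pi / 2, 0.0))
```

Warping the dome's vertex positions (as noted above) is equivalent to perturbing this mapping per vertex, which is how non-ideal lens distortion near the horizon can be compensated without re-encoding the video.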
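URL-driven sharing of this kind can be sketched with standard query-string parsing: the viewer reads the night, seek time, and camera parameters from the link itself, so no server-side index is needed. The parameter names and defaults below are hypothetical, not the app's actual scheme.

```python
# Sketch of a shareable viewer link carrying night, time, and camera
# parameters in the URL query string (hypothetical parameter names).
from urllib.parse import urlparse, parse_qs

def viewer_params(url):
    """Extract replay parameters from a viewer URL, with defaults."""
    qs = parse_qs(urlparse(url).query)
    return {
        "night": qs.get("night", ["20200101"])[0],
        "t": float(qs.get("t", ["0"])[0]),        # seek position, seconds
        "fov": float(qs.get("fov", ["180"])[0]),  # field of view, degrees
    }

params = viewer_params("https://example.org/auroradome?night=20190214&t=42.5")
print(params)
```

Because every parameter lives in the URL, pasting the link reproduces the exact moment and camera configuration on any device.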
What did we learn from a user survey?
The majority of surveyed users planned to use the app again. Users often ignore instructions and just want to start searching and experiencing. About 50% used smartphones.
- Users from the Alberta Aurora Chaser Group and the Royal Astronomical Society of Canada (Ottawa Chapter) were surveyed about their experience testing the app.
- Users were evenly divided between smartphone and desktop platforms, which agrees with other published surveys
- Users were evenly divided in frequency of seeing aurora in person, from 'never' through 'often'.
- The survey responses indicated they were able to use the Keogramist index and AuroraDome viewer to find and experience interesting aurora video, and that they would use the app again.
- A significant number of users did not read the introduction or instructions, leading to some confusion later on (i.e. asking for features that were covered in the onboarding). This indicated that any guidance or instructions should be inline and cue the user, rather than front-loading context or direction
Researchers, citizen scientists, and enthusiasts used the app and provided useful feedback
- Web-based applications proved robust and performant on a variety of mobile consumer computing platforms using standard web browsers. No special server-side software or platform-specific client "app" was required
- Researchers can also use this interface for quick mobile review of data, particularly for events of interest, e.g. substorms which are easily identified.
- Minor image processing (saturation enhancement) was found to enhance the visibility of aurora features in keograms (specifically, different emission wavelengths). This processing is documented to maintain scientific integrity.
- The app has been accessed hundreds of times since launching in February 2019 by aurora chasers and citizen scientists across the globe, with specific interest from Royal Astronomical Society of Canada members and Facebook groups such as Alberta Aurora Chasers.
Implications & next steps
This extensible platform works with a variety of all-sky cameras and is a useful tool for amateur and professional audiences. Next steps include more cameras, more search filters, and the ability to tag specific events.
- This work represents an example of citizen science, in which an individual from one field (in this case computer vision and mobile web architectures) made a new contribution to a different domain (visual aurora research)
- This platform may enable citizen scientists to interact with vast amounts of data for image and feature characterization, similar to the Zooniverse platform efforts.
- The work is being highlighted by the Aurorasaurus citizen science project to guide its applicability and broadcast it to interested users
- This interface has also shown promise for additional filters, for example filtering nights by Kp or by computer-vision-based ranking metrics; such features have also been requested in user feedback
- Within reasonable constraints, the AuroraDome viewer is agnostic to the specific ASC video source, framerate, resolution, and projection type (including real-time video)
- The interface has demonstrated a performant and usable platform on mobile and desktop browsers, without the need to install software or distribute through app stores
What is the most MacGyver aspect of this work?
This work puts existing all-sky camera timelapse and real-time video archives in a whole new perspective. There is no need to go out and collect new data; all we need is the existing timelapse video. The timestamp can even be extracted directly from the video frames in the absence of any other context. This application can bring citizen scientists and timelapse datasets into direct contact, and we have seen the success citizen scientists have had in broadening aurora research.
 Conversation with Dr. Don Hampton, University of Alaska, Fairbanks, May 21, 2020
 Online survey directed to Alberta Aurora Chasers Facebook Group and Royal Astronomical Society of Canada (Ottawa Chapter) November 2020
Special thanks to
Dr. Eric Donovan, Dr. Emma Spanswick, and Darren Chaddock at the University of Calgary Auroral Imaging Group
Dr. Don Hampton at the University of Alaska, Fairbanks, Geophysical Institute
The Royal Astronomical Society of Canada (RASC) Ottawa Chapter
The Alberta Aurora Chasers Group
Laura Brandt of Aurorasaurus
Alan Dyer of RASC Calgary Chapter