Archives For .Astronomy

One Week

March 2, 2015 — 1 Comment

Yesterday marked 5 years since I joined the Zooniverse team in Oxford, straight out of my PhD at Cardiff. It’s weird to say it but this week will be my final week here before I start a new role at Google, in London.

When I arrived at the Zooniverse there were only two people here: Arfon Smith and Chris Lintott. Though there has always been a cloud of other researchers around the Zooniverse, they were the only full-time Zooniverse team. That changed a lot in the next 5 years!

(Most of the) The Zooniverse Team, May 2014


I’ve never been one to fit in other people’s boxes, so Zooniverse suited me from the start. Unconventional, yet accessible; research, but not as we knew it. The Zooniverse has been a fantastic place to work. Indeed it still is. I’ve had the pleasure of building unique projects that have benefited astronomy and science. I’ve worked with remarkable researchers, developers, educators, and herders. It has been a lot of fun and I’ve been able to be part of its growth and evolution.

Over the years I have read many blogs and articles, usually written by someone leaving research, about how academia has a brain drain problem, or lacks a family-friendly environment, or can’t compete with industry. I have sometimes agreed, though usually quietly. Most of these pieces are dismissed by those left in academia, even if they are shared widely by them at the same time. I won’t be writing such a post. Do I think academia is perfect? No. But no job suits everyone. Do I think that academia could do more for minorities, women, and families? Yes. But all jobs probably could. Being a postdoc has afforded me great flexibility with my time, and also given me the chance to travel and engage in awesome new ideas. It hasn’t given me stability though, and since I don’t want to be a professor, I’m not sure where it takes me as a career. I’d recommend it to everyone and no one at the same time. I’ve had a great time, but now it’s time to go. I’m terrified of course, but sometimes you have to make a giant leap when the opportunity presents itself.

Recycled Electrons and The Rewatch will both continue. The Rewatch will remain mostly unchanged, but you will hear less of me on Recycled Electrons – simply the result of time constraints. .Astronomy is also being taken care of, and I’ll blog about that separately. Rest assured though that #dotastro 7 and 8 are in hand.

It will be so sad to leave the Zooniverse, but I’m incredibly excited about Google. I’ll probably go quiet here for a while as I start my new job. I’m not gone though – just throwing myself into the new role, and meeting an exciting challenge head on. See you on the other side.


Hubot is an open source chatbot created by GitHub. It’s used by various companies, groups, and other techie types to control systems, gather information, and put moustaches on things – all via chat interfaces. Hubot can be adapted to work via IM, GTalk, Twitter, IRC, and other platforms. ‘Chat Ops’ is a growing trend, and because it is simple, and quite charming, I think it may stick around.

I’ve just finished an epic few days at the sixth .Astronomy event. This is my own conference series, and I’m gleefully exhausted from several days of talking, making, and hosting my favourite event of the year. More on that in a later post. During the .Astronomy 6 Hack Day (sponsored by GitHub in fact!) I worked on making an astronomical Hubot – which I’ve called ‘botastro’ in honour of the #dotastro hashtag from .Astronomy itself.

@botastro exists only on Twitter (for now) and to interact you just tweet it: send it a command and @botastro will tweet back a reply.

You can send multiple messages to the bot, but I have a growing list of other ideas too. Currently you can say things like:

  • @botastro sunrise Chicago
  • @botastro apod me
  • @botastro galaxify hello world
  • @botastro fun fact
  • @botastro moonphase
  • @botastro tell me about Jupiter
  • @botastro show me Perseus
  • @botastro gif dog
  • @botastro exoplanet me
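The real botastro scripts are written in CoffeeScript against Hubot’s `respond` API, but the underlying pattern is simple enough to sketch in Python: each script registers a regex, and the first matching handler builds the reply. Everything below is illustrative (the handler replies are placeholders, not what @botastro actually says):

```python
import re

# A minimal sketch of Hubot's "respond" pattern: each script registers a
# regex, and the first matching handler builds the reply. The real botastro
# scripts are CoffeeScript; this Python version is purely illustrative.
HANDLERS = []

def respond(pattern):
    """Register a handler for tweets matching `pattern`."""
    def decorator(func):
        HANDLERS.append((re.compile(pattern, re.IGNORECASE), func))
        return func
    return decorator

@respond(r"moonphase")
def moonphase(match):
    return "The Moon is currently a waxing gibbous."  # placeholder reply

@respond(r"tell me about (.+)")
def tell_me_about(match):
    # A real script would query a service like lookUP here.
    return "Looking up %s..." % match.group(1)

def handle(tweet):
    """Strip the @-mention and dispatch to the first matching handler."""
    text = re.sub(r"^@botastro\s+", "", tweet)
    for regex, func in HANDLERS:
        match = regex.search(text)
        if match:
            return func(match)
    return "Sorry, I don't understand."
```

Adding a new command is then just a matter of registering one more pattern, which is essentially what a new Hubot script in the repo does.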

Asking botastro to ‘galaxify’ something results in text made up of galaxies from Galaxy Zoo (thanks Stuart Lynn!), which is pretty,


and asking it to ‘exoplanet me’ gives you an exoplanet from the catalogue (thanks to Dan Foreman-Mackey and Geert Barentsen). The results you get when asking it to show you something or tell you about something are sourced from Stuart Lowe’s lookUP service, and the space gifs come from Giphy.

These may be silly and fun, but more complex actions become possible – especially once I get a bit more used to CoffeeScript, the language this bot is written in.

@botastro is open source (on GitHub, naturally) and I’d love it if people wanted to add functionality. If you want to try, you’ll need to fork the repo, create a new script, and submit a pull request. Hubot is outlined here, and you can look at botastro’s other scripts for examples too.

Since 2008 I have been running .Astronomy, which is a meeting/hackathon/unconference that aims to be better than normal meetings and to foster new ideas and collaborations. It’s a playground for astro geeks that is more specific than a general hack day, but way more freeform than a normal astronomy meeting. Through .Astronomy we have developed an amazing community.

I know people that have gotten jobs because of .Astronomy, changed careers because of .Astronomy – or even left astronomy because of .Astronomy (in a good way!). We have evolved into an interesting group, with a culture and way of thinking that we take back to our ‘real’ jobs after each event.

In short: it works. Now I’d like to work out how to spread the idea into more academic fields. We’re looking for people in other research areas, such as economics, maths, chemistry, medicine and more.

Adler Planetarium

I have funding from the Alfred P. Sloan Foundation to bring a handful of non-astronomers to this year’s .Astronomy, in Chicago at the amazing Adler Planetarium (December 8-10). The aim is to meet up at the end, and discuss whether you think it could work in your own field, and what you’d need to make that happen. If you’re a researcher who isn’t an astronomer, and you think this sounds great, then that could be you! We have funding to pay for flights, hotels and expenses. It will be a lot of fun – and despite the astronomy focus of the event, I think most researchers with a bit of tech experience would get a lot out of it.

If you’re interested then fill out the short form at or email me on for more information. We are following a formal selection process, but we’re doing it very quickly and will decide by Nov 7th, to allow enough time ahead of the event to make travel plans and such. So don’t delay – do it now!

If you don’t think you’re the right person for this, then maybe you know who could be. If so, let them know and send them to for more information.

The latest issue of Astronomy & Geophysics includes an article by yours truly about the GitHub/.Astronomy Hack Day at the UK’s National Astronomy Meeting in Portsmouth earlier this year.

The projects resulting from hack days are often prototypes, or proof-of-concept ideas that are meant to grow and expand later. Often they are simply written up and shared online for someone else to take on if they wish. This ethos of sharing and openness was evident at the NAM hack day, when people would periodically stand up and shout to the room, asking for anyone with skills in a particular area, or access to specific hardware.

Take a look here:

Martian Nyan Cat



Executable papers are a cool idea in research [1]. You take a study, write it up as a paper, and bundle together all your code, scripts and analysis in such a way that other people can take the ‘paper’ and run it themselves. This has three main attractive features, as I see it:

  1. It provides transparency for other researchers and allows everyone to run through your working to follow along step-by-step.
  2. It allows your peers to give you detailed feedback and ideas for improvements – or do the improvements themselves.
  3. It allows others to take your work and try it out on their own data.

The main problem is that these don’t really exist ‘in the wild’, and where they do they’re in bespoke formats even if they’re open source. The IPython Notebook is a great way of doing something very much like an executable paper, for example. Another way would be to bundle up a virtual machine and share a disk image. Executable papers would allow for rapid-turnaround science to happen. For example, let’s imagine that you create a study and use some current data to form a theory or model. You do an analysis and create an executable paper. You store that paper in a library and the library periodically reruns the study when new data become available [2]. The library might be a university library server, or maybe it’s something like the arXiv, ePrints, or GitHub.
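One way to picture the library’s role is as a watcher that re-runs each deposited analysis whenever a new data release arrives. Here is a deliberately toy sketch of that loop (all names are made up; a real system would pull bundles from somewhere like the arXiv or GitHub and run them in a sandboxed VM):

```python
import hashlib

def run_paper(analysis, dataset):
    """Re-run a bundled analysis function against a dataset, returning a
    'publication': the result plus a fingerprint of the data it used."""
    fingerprint = hashlib.sha1(repr(dataset).encode()).hexdigest()[:8]
    return {"result": analysis(dataset), "data_version": fingerprint}

class Library:
    """Toy 'library server' that stores executable papers and re-runs
    every one of them whenever it is notified of new data."""
    def __init__(self):
        self.papers = []    # list of (name, analysis function)
        self.editions = {}  # name -> list of published editions

    def deposit(self, name, analysis):
        self.papers.append((name, analysis))
        self.editions[name] = []

    def new_data(self, dataset):
        # In reality this would be triggered by a survey data release.
        for name, analysis in self.papers:
            self.editions[name].append(run_paper(analysis, dataset))

# Example: a 'paper' whose whole analysis is just the mean of the data.
library = Library()
library.deposit("mean-model", lambda data: sum(data) / len(data))
library.new_data([1.0, 2.0, 3.0])   # first release
library.new_data([1.0, 2.0, 9.0])   # updated release -> paper re-runs
```

Each re-run produces a new ‘edition’ of the paper tied to the data version it ran against, which is the part a real executable-paper archive would have to get right.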

This is roughly what happens in some very competitive fields of science already – only with humans. Researchers write papers using simulated data and, the instant they can access the anticipated data, they import, run, and publish. With observations of the Cosmic Microwave Background (CMB), several competing groups are waiting to work on the data – and new data come out very rarely. In fact, the day after the Planck CMB data were released last year, there was a flurry of papers submitted to the arXiv. Those who got in early had likely pre-written much of the work and simply ran their code as soon as they had downloaded and parsed the new, published data.

If executable papers could be left alone to scan the literature for new, useful data then they could also look for new results from each other. A set of executable papers could work together, without planning, to create new hypotheses and new understanding of the world. Whilst one paper crunches new environmental data, processing it into a catalogue, another could use the new catalogue to update climate change models and even automatically publish significant changes or new potential impacts for the economy.

It should be possible to make predictions in executable papers and have them automatically check for certain observational data and automatically republish updated results. So one can imagine a topical astronomy example where the BICEP2 results would be automatically checked against any released Planck data, creating new publications when statistical tests are met. Someone should do this if they haven’t already. In this way, papers can continue to further, or verify, our understanding long after publication.

SKA Rendering (Wikimedia Commons)


This is high-frequency science [3], akin to high-frequency trading, and it seems like an interesting approach to some upcoming data-flow issues in science. The Large Hadron Collider (LHC), the Large Synoptic Survey Telescope (LSST), and the Square Kilometre Array (SKA) are all huge scientific instruments set to explore new parts of the universe, gathering huge volumes of data to be analysed.

Even the deployment of Zooniverse-scale citizen science cannot get around the fact that instruments like the SKA will create volumes of data that we don’t know what to do with, at a pace we’ve never seen before. I wonder if executable papers, set to scour the SKA servers for new data, could alleviate part of the issue by automatically searching for theorised trends. The papers would be sourced by the whole community, and peer-reviewed as is done today, effectively crowdsourcing the hypotheses through publications. This cloud of interconnected, virtual researchers, would continuously generate analyses that could be verified by some second peer-review process; since one would expect a great deal of nonsense in such a setup.

When this came up at a meeting the other day, Kevin Page (OeRC) remarked that we might just be describing sensors. In a way he’s right – but these are software sensors, built on the platform and infrastructure of the scientific community. They’re more like advanced tools; a set of ghost researchers, left to think about an idea in perpetuity, in service of the community that created them.

I’ve no idea if I’m describing anything real here – or if it’s just a way of partially automating the process of science. The idea stuck with me and I found myself writing about it to flesh it out – thus this blog post – and wondering how to code something like it. Maybe you have a notion too. If so, get in touch!


[1] But not a new one really. It did come up again at a recent Social Machines meeting though, hence this post.
[2] David De Roure outlined this idea quite casually in a meeting the other day; I’ve no idea if it’s his or just something he’s heard a lot and thought was quite cool.
[3] This phrasing isn’t mine, but as soon as I heard it, I loved it. The whole room got chatting about this very quickly so provenance was lost I’m afraid.

Yesterday was the Hack Day at the UK National Astronomy Meeting 2014 in Portsmouth. I organised it with my good friend Arfon Smith of GitHub, formerly Zooniverse. We wanted to try and start a new NAM tradition – it went well so maybe we did. I’m psyched that .Astronomy got to help make it happen – not just through my involvement, but the many .Astronomy alumni who attended!
Some of the hack projects have already started to appear online, such as the Martian Nyan Cat from Geert Barentsen, Jeremy Harwood, and Leigh Smith (Hertfordshire), which is planning to fly over the entirety of ESA’s Mars Express data archive in one continuous, two-day-long flight. You can also grab the code for Duncan Forgan’s beautiful ‘Music of the Spheres’ project, which sonifies the rhythms of planetary systems. Other projects are harder to place online, such as Jane Greaves’ knitted galaxy cluster – with dark matter contributed by many people during the hack day itself.

I spent much of the day working with Edward Gomez (LCOGT) on the littleBits Space Kit. littleBits is a modular system of circuits that lets anyone try their hand at something that ordinarily requires a soldering iron. littleBits components may be switches, sensors, servos, or anything really, and they connect magnetically to create deceptively simple circuits that can be quite powerful.


For example you could connect an infrared sensor and an LED to make a device that flashes when you press buttons on your remote. Or you could use a microphone and a digital LCD display to create a sound meter. The littleBits components are sturdy enough to withstand being bashed about a bit, and simple and large enough to let you stick them on cardboard, homemade figures, or anything else you find around the house. I found out about littleBits when I met their creator, Ayah Bdeir, at TED in March. She is a fellow TED Fellow.

We decided fairly quickly to try and build an exoplanet simulator of some sort and ended up creating the littleBits Exoplanet Detector (and cup orrery). There were two parts to this: a cup-based orrery, and a transit detector.

The cup orrery consisted of a rotating ‘planetary system’ fashioned from a coffee cup mounted on a simple motor component – we only had hack day supplies to play with – and a central LED ‘star’. Some more cups and stirrers were required to scaffold the system into a stable state but it was soon working.

The transit detector used a light-sensor component that read out to both a speaker and an LCD numerical display – Ed refers to this as the laser display board. With a bit of shielding from the buffet’s handy, black, plastic plates, the light sensor can easily pick up the LED and you can see the light intensity readout varying as the paper planet passes in front of the star. It was awesome. We got very excited when it actually worked!

You might think that was geeky enough, but it gets better. I realised I could use my iPhone 5s – which has a high-frame-rate video mode – to record the model working in slow motion and allow us to better see the digital readout. We also realised that the littleBits speaker component can accept an audio jack and so could use the phone to feed in a pure tone, which made it much easier to hear the pulsing dips of the transits.

Finally, we had the idea to record this nice, tonal sound output from the detector and create waveforms to see if we could recover any properties about the exoplanets. And sure enough: we can! We built several different coffee-cup planetary systems (including a big planet, small planet, and twin planets) and their different properties are visible in their waveforms. Ed is planning a more rigorous exploration of this at a later date, but you can see and hear the large cup planet’s waveform below.
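To make ‘recovering properties from the waveform’ concrete, here is a toy version of the analysis: given a series of brightness (or audio-envelope) samples, estimate the transit depth from the dip size and the period from the spacing of the dips. The readings below are entirely synthetic, not data from our cup orrery:

```python
def transit_properties(samples, sample_rate):
    """Estimate transit depth and period from a 1-D brightness series.
    depth = (baseline - minimum) / baseline; period = spacing of dips."""
    baseline = max(samples)
    minimum = min(samples)
    depth = (baseline - minimum) / baseline
    # A 'dip' starts where a sample first drops below the halfway point
    # between baseline and minimum.
    threshold = (baseline + minimum) / 2
    dip_starts = [i for i in range(1, len(samples))
                  if samples[i] < threshold <= samples[i - 1]]
    if len(dip_starts) < 2:
        return depth, None  # need two dips to measure a period
    gaps = [b - a for a, b in zip(dip_starts, dip_starts[1:])]
    period = (sum(gaps) / len(gaps)) / sample_rate
    return depth, period

# Synthetic light-sensor readings: baseline 1.0, two dips to 0.6,
# 10 samples apart, sampled at 10 samples per second -> 1 s period.
readings = [1.0]*4 + [0.6]*2 + [1.0]*8 + [0.6]*2 + [1.0]*4
depth, period = transit_properties(readings, sample_rate=10)
```

A deeper dip means a bigger cup planet, and twin planets show up as alternating dip spacings – which is exactly what the recorded waveforms let us see.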

Waveform for Large Cup Planet


So if you want to try something like this, you only need the littleBits Space Kit. You can buy them online and I’d love to see more of these kits, and to see them in schools. I’m now totally addicted to the idea myself too!

GitHub Stickers

Thanks to Arfon for suggesting that we do this Hack Day together; to the NAM 2014 Portsmouth team for being so supportive; and to GitHub for sponsoring it – where else would we have gotten all the cups?!

Today is the start of the UK National Astronomy Meeting in Portsmouth. I’ll be there tomorrow, and running the NAM Hack Day on Wednesday with Arfon Smith – which is going to be awesome. Today at NAM, the nation’s astronomers will discuss the case for UK involvement in the Large Synoptic Survey Telescope project – the LSST. The LSST is a huge telescope, and a massive undertaking. It will change astronomy in a profound way.

A photograph and a rendering mix of the exterior LSST building, showing the dome open and road leading away from the site.

With every image it takes, the LSST will be able to record a very large patch of sky (~50 times the size of the full Moon). It will take more than 800 images each night and can image its* entire sky twice a week! Billions of galaxies, stars, and solar system objects will be seen for the first time and monitored over a period of 10 years. Crucially, it will use its rapid-imaging power to look for moving or ‘transient’ things in the night sky. It will be an excellent tool for detecting supernovae, asteroids, exoplanets and more of the things that move from night to night or week to week. For example, the LSST could be used to detect and track potentially hazardous asteroids that might impact the Earth. It will also help us understand dark energy – the mysterious force that seems to keep our universe expanding – by mapping the precise locations of billions of galaxies.
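The twice-a-week claim hangs together on the back of an envelope. The figures below are rough (the full Moon covers about 0.2 square degrees, so ~50 of them is roughly the LSST’s ~10 square degree field of view, and the sky observable from one site is taken very loosely as ~20,000 square degrees):

```python
# Back-of-envelope check on the LSST cadence claims above.
# All figures approximate and assumed for illustration only.

full_moon_area = 0.2                  # square degrees, approximate
field_of_view = 50 * full_moon_area   # ~10 sq deg per image
images_per_night = 800
visible_sky = 20000.0                 # sq deg observable from the site, roughly

coverage_per_night = images_per_night * field_of_view  # sq deg imaged per night
nights_per_pass = visible_sky / coverage_per_night     # nights to cover the sky once
passes_per_week = 7 / nights_per_pass                  # comes out at a few per week
```

Around 8,000 square degrees a night means the whole observable sky every two or three nights, i.e. a couple of complete passes a week, consistent with the quoted cadence.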

I’ve recently become LSST:UK’s Public Data Coordinator – think ‘chief hacker’ if you prefer. The LSST’s unprecedented archive of data will be a resource we can tap into to create new kinds of public outreach tools, data visualisations, and citizen science. In recent years, we at the Zooniverse have pioneered citizen science investigations of data in astronomy**. The citizen science and amateur astronomy communities around the UK, and the world, will be able to access the amazing data that comes out of the LSST both through structured, Zooniverse-style projects and in a more freeform manner. The potential for discovery will be on a scale we haven’t seen before. It’s very exciting.

The LSST is a public-private partnership and is led by the United States. The unique scientific opportunities presented by the LSST have led to the formation of a group of astronomers from more than 30 UK universities. We’ll be asking for funding from the Science and Technology Facilities Council to support UK participation in the project.

Spinnaker Tower from the Gosport Ferry


If you’re at NAM this week, then I’d love to talk about LSST, hacking on data, and Zooniverse. On Wednesday you’ll find me in the Park Building, at the University of Portsmouth at the GitHub/.Astronomy NAM 2014 Hack Day. I’ll also be at the GitHub drink up on Tuesday night at The White Swan from 7pm – where you can enjoy some of the finest cask ales, draught beers and wines in Portsmouth – and GitHub are paying! More details at

* i.e. the sky visible from its location – not literally the entire sky
** We’ve now had more than 1 million volunteers pass through our digital doors.

TED 2014 has just ended here in Vancouver and I have finally now experienced an event I’ve heard a lot about for many years. I’ve watched TED talks online for as long as I’ve watched anything online and the real deal did not disappoint. Attending TED for the first time has been intense, wonderful, and dizzying and it was great to be here for its special 30th anniversary year. Highlights from my experience are difficult to streamline into a blog post. So this is my best shot.


Each day at TED has presented numerous inspirational speakers and amazing ideas, curated into themes by the organisers such as ‘Liftoff!’, ‘Reshape’, ‘Hacked’, and ‘Onward’. These sessions took place in the event’s central venue; a custom-designed, wooden amphitheatre – built over only a few days before the event opened. These talks were usually 12 or 18 minutes long and sometimes formatted as interviews where relevant. The changing topics and formats were paced in a way that meant I rarely felt tired or restless – which is amazing since I’ve had about 5 hours sleep each night! You can already see some talks from these sessions including Colonel Chris Hadfield on conquering fear, and Edward Snowden on privacy – a talk delivered via a roving telepresence robot!

A secondary, slightly smaller venue housed the ‘All-Star’ sessions. These were totally packed-out as they consisted of many notable TED speakers from the past 30 years, each giving 4 minute talks to update or reflect on their work and ideas. Speakers here included a wide range of awesome folks such as Sir Tim Berners-Lee, Imogen Heap, Dan Gilbert, Jimmy Wales, Sir Martin Rees, and even General Stanley McChrystal. All of them had just 4 minutes, which kept the energy high, and the pace steady.


I was very happy to see that some of the best talks of the week were about science. Ed Yong (Nat. Geo. blogger) told us about parasites, Sara Lewis (Tufts) about fireflies and Andy Connolly (U. Washington) about the future of astronomy. All extremely well-crafted and well-delivered talks about often complex topics. Hugh Herr’s (MIT Media Lab) outline of the future of bionics and prosthetic limbs was not just a tale of amazing science, but also included a live performance by a ballroom dancer who lost her leg in the Boston Marathon bombings and can now dance again thanks to the help of his MIT lab. A perfectly ‘TED’ moment and a moving thing to witness.

There is also a special place in my heart for some of the technology speakers, including Margaret Gould Stewart, who talked about designing and changing Facebook and its impact on user behaviour; and Del Harvey, who won everyone over with her sardonic delivery of a talk about managing the stranger side of Twitter. Keren Elazari delivered a moving lesson in why hackers may keep us all safe and keep governments honest – a talk that will be timely if posted quickly, and which proves how refreshing and important it is to have women talk about tech. Something that is all too rare.

Supermodel Geena Rocero came out as transgender live on the TED stage; Mellody Hobson gave a challenging and optimistic talk on race; and Mark Ronson gave a talk/performance about remixing and reclaiming music – partly involving a live remixing of other TED talks within his own. It was pretty genius – though it may have been lost on a large chunk of the audience.

The average age of TED attendees is predictably quite high – and I think that must be part of the thinking behind the TED Fellows program – the whole reason I’m here with about 20 other folks from around the world. By supporting its growing Fellows community, TED is creating new connections and networks, but also injecting a chunk of people into the conference that otherwise would not be able to attend. As part of our participation, all the fellows give their own short talk at the opening of the event.

The prospect of giving my own 4 minute talk on Monday was a big part of my life leading up to the conference. 4 minutes is not a long time, and that fact seemed only to amplify the preparation required, and the intensity of my nerves. I felt shaky and sick as I walked into the lights on Monday morning, but once there, a strange calm fell upon me and I simply delivered my talk. My intense preparation suddenly seemed like a wise investment, and although I can’t say I relaxed, I definitely enjoyed it. Those 4 minutes flew by in the end.

I used my talk to highlight the wonderful work we do at the Zooniverse, and framed what we do in the context of ‘big data’ in science, and in discoveries that are waiting to be made if we allow the public access to our data. I think it went well, and I’ve certainly had many attendees and journalists ask me about it in the days since. Not all the Fellows talks go online – but we’ll each get to see our own eventually. They edit them and send them out in the weeks and months to come.

The TED Fellows programme also provides coaching, mentorship, training and a bunch of other amazing experience and advice too. I can’t recommend it highly enough. A huge thank you goes to Tom Reilly, Shoham Arad, Sam Kelly, Corey Mohr, Patrick D’Arcy and the whole TED Fellows team. I’m so excited about the collaborations and ideas being generated between the group and what we can do in the future.

Giving the talk was unforgettable, and attending TED has been a dream come true. I am feeling motivated and inspired, but most importantly I’ve made lots of new connections and contacts for projects to work on in the near future. For now though we have one more engagement: a farewell dinner with the rest of the TED Fellows. Then it will be time to go back to Oxford and resume regularly scheduled programming. After .Astronomy I always get the .Astro blues and I can tell it will be the same for TED, but it is time to head home and see how I can take all these ideas and actually do something with them.

.Astronomy 5: What’s Next?

September 20, 2013 — 3 Comments

The .Astronomy 5 Unphoto – Credit: Demitri Muna

As the fifth .Astronomy came to a close on Wednesday, I felt as I always do at the end of these meetings: tired, emotional and super-excited. It’s hard to explain the energy at these events. There is something almost magical in the air as the participants ‘click’ (usually about an hour in) and then begin talking, making and doing great work.

.Astronomy is about actually doing something. As Kelle Cruz and I remarked yesterday – we like ‘people that do shit’. At .Astronomy you feel that if someone has an idea we should just all try and make it happen. It could be the best thing ever, and failure is just a chance to learn. It’s not a common attitude in astronomy and it’s certainly difficult for many early-career people to think that way.

I’ve always been lucky. My PhD supervisor was very willing to let me try crazy things (he let me get distracted by creating .Astronomy for a start!). At the Zooniverse we have spent years now, just pushing code live and making new things. They’re not always perfect, but we learn every time and we have left a trail of marvellous creations on the way. Each new thing learns from the last.

We also absorb the ideas of others quickly, and encourage collaboration with new people. It’s this approach that led to the creations of some of our most interesting projects recently, such as Snapshot Serengeti, the Andromeda Project and Space Warps.

During his Keynote talk Tony Hey (Microsoft Research) showed a quote I’ve not seen before.

“If you don’t like change, you’re going to like irrelevance even less.” – General Eric Shinseki, retired Chief of Staff, U. S. Army

I think I might put this on my wall. It sums up perfectly how I see much of science and could easily be the motto of .Astronomy. Tony’s keynote was brilliant BTW and you can see it here. Tony spoke about the Fourth Paradigm and told the tale of how the availability of astronomical data led to the SDSS SkyServer, which sparked the creation of Galaxy Zoo, which sparked the Zooniverse. In a way, .Astronomy was partly sparked by Galaxy Zoo too.

The folks at .Astronomy have built many projects that embrace the web fully, with an ethos of sharing and participation. These projects are changing the way astronomy and outreach are done: Chromoscope, 365 Days of Astronomy, AstroBetter, Astropy, astrojs, and the Seamless Astronomy group’s ‘Bones of the Milky Way’ paper; there are more, but these are excellent examples.

So after .Astronomy 5 I’m left wondering where to take it next in order to facilitate more of these projects. There were 40 hack day pitches at this year’s event. There were so many hack day reports the following day (the 2-3 minute slots where people show off their results) that we had to overrun into coffee and use up most of lunchtime too. Many of those hacks will, I hope, soon be appearing on the .Astronomy blog when people have time to write them up. Some of them are already popping up on GitHub (e.g. d3po).

The other wonderful thing about the meeting was how it once again encouraged genuine debates and discussions that sound like they might actually lead to change. The unconference sessions on diversity in astronomy went beyond the usual format and did not fall into the trap of collectively preaching to the choir. A document has been drafted with actionable ideas. I hope it is revisited soon. Similarly, sessions on the future of academic publishing were not bogged down in the usual complaints but actually became a real debate about practical things we could do differently.

There were also highly informative unconference sessions that would not have happened elsewhere; enthusiastic tutorials on Astropy, Authorea and the merits of combining noisy classifiers all jump to mind. These meetings organically emerge from the crowd at .Astronomy and they’re interactive, productive, and brilliant.

So as I ponder on the future of .Astronomy (I’d love your thoughts) I’ll leave you with some of the wonderful video hacks that were produced at this year’s event. Don’t Call Me Colin is a song about a sad exoplanet from Niall Deacon, Emily Rice, Ruth Angus and others. There is also a timelapse of .Astronomy itself in action from Amanda Bauer.

Thank you to everybody who took part, gave their time to talk, helped organise the event, and followed along online. It was a great meeting and I’m already looking forward to the next one. Long live #dotastro!

During the Perseid meteor shower, I blogged a video of a bright meteor taken by astrophotographer Mel Gigg. He had shared the image fairly widely and soon others noticed that they had caught the exact same shooting star themselves. In fact four observers had caught the same object as it flew into the atmosphere above Southern England; three of them have shared their images online (Wayne Young, Mel Gigg and Steve Knight).

Credit: Wayne Young


Credit: Mel Gigg


Credit: Steve Knight


Look carefully and you’ll see that these images show the same streak of light but against drastically different star fields. That’s because meteors are high above the ground and visible across a large area. Due to the effect of parallax, they appear to shift relative to the night sky for different observers. In the extreme, an observer underneath the meteor would see it go directly overhead, whereas others might see it from the side, where it would appear to fly nearer to the horizon. In this case it was seen by four people from different positions, so they each had a different angle on the meteor and a different backdrop of stars.
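The parallax idea reduces to simple triangulation. In a flat-ground, two-dimensional sketch, two observers a known distance apart each measure the meteor’s elevation angle, and the crossing point of the two sight lines gives its height. Real reconstructions (like Wayne’s, below) work in 3D from the star fields in each photo; this toy version, with made-up numbers, just shows the geometry:

```python
import math

def meteor_height(baseline_km, elev1_deg, elev2_deg):
    """Height of a point seen at elevation elev1 from observer A and
    elev2 from observer B, where B is baseline_km further away from the
    point along the same line (flat-ground, 2-D approximation).

    From h = x * tan(e1) = (x + baseline) * tan(e2), solve for the
    ground distance x from observer A:
        x = baseline * tan(e2) / (tan(e1) - tan(e2))
    """
    t1 = math.tan(math.radians(elev1_deg))
    t2 = math.tan(math.radians(elev2_deg))
    x = baseline_km * t2 / (t1 - t2)  # ground distance from observer A
    return x * t1

# Made-up example: observers 50 km apart see the meteor at 60 and 45 degrees.
# Perseids typically ablate at roughly 80-120 km altitude.
height = meteor_height(50, 60, 45)
```

With four photographs instead of two, the extra sight lines over-determine the solution, which is what makes Wayne’s full 3D path fit possible.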

In a wonderful example of citizen science, Wayne Young (one of the four photographers) took the four images and the lat/long data of each observer’s location, and created a 3D model of this particular Perseid’s path. You can see it below modelled in Google Earth (KML file here).

To create this he’s triangulated the path of the meteor by comparing each of the four images to one another. Given the capabilities of computer vision tools, I wonder how much of this could be automated. It wouldn’t be hard to search Flickr for shooting stars seen at similar times and locations – maybe we could scrape more trajectories automagically? This might be an ideal hack day project for .Astronomy. It would be interesting to plot many of these paths on top of each other.

It’s fun to surmise that, given this Perseid’s path, it would have touched down in a field in North Devon. Good job it most likely disintegrated long before then.