Since 2008 I have been running .Astronomy, a meeting/hackathon/unconference that aims to be better than normal meetings and to foster new ideas and collaborations. It’s a playground for astro geeks that is more specific than a general hack day, but way more freeform than a normal astronomy meeting. Through .Astronomy we have developed an amazing community.

I know people who have gotten jobs because of .Astronomy, changed careers because of .Astronomy – or even left astronomy because of .Astronomy (in a good way!). We have evolved into an interesting group, with a culture and way of thinking that we take back to our ‘real’ jobs after each event.

In short: it works. Now I’d like to work out how to spread the idea into more academic fields. We’re looking for people in other research areas, such as economics, maths, chemistry, medicine and more.

Adler Planetarium

I have funding from the Alfred P. Sloan Foundation to bring a handful of non-astronomers to this year’s .Astronomy, in Chicago at the amazing Adler Planetarium (December 8-10). The aim is to meet up at the end and discuss whether you think it could work in your own field, and what you’d need to make that happen. If you’re a researcher who isn’t an astronomer, and you think this sounds great, then that could be you! We have funding to pay for flights, hotels and expenses. It will be a lot of fun – and despite the astronomy focus of the event, I think most researchers with a bit of tech experience would get a lot out of it.

If you’re interested then fill out the short form at http://bit.ly/dotastromulti or email me on rob@dotastronomy.com for more information. We are following a formal selection process, but we’re doing it very quickly and will decide by Nov 7th, to allow enough time ahead of the event to make travel plans and such. So don’t delay – do it now!

If you don’t think you’re the right person for this, then maybe you know who could be. If so, let them know and send them to http://dotastronomy.com/about/astronomy-6-multidisciplinary-program/ for more information.


I’ve been called a lot of things but ‘rebel’ hasn’t come up too often. Not that I mind. As part of a Mazda campaign, I’m being highlighted as one of four TED Fellows* who are ‘Mazda Rebels’. The other three are thoroughly impressive and I recommend you take a look. There’s an online vote where the public help choose whoever they think deserves a Mazda grant to help their project.

My video can be found here. It’s lovely and I really enjoyed making it. It nicely describes my work with Zooniverse (special guest starring Brooke Simmons as a Snapshot Serengeti volunteer!) in a fun, accessible way. We had a laugh creating it, and they have kept many of the out-takes in the video, which I rather enjoyed.

If I win the vote then I’ll be using the money to kick-start the Zooniverse’s efforts in disaster relief with a ‘First Responders’ project. Think Milky Way Project but with aerial photos of recent disasters, with volunteers helping locate resources, danger, and people. This is something several of us at Zooniverse HQ are very keen on, and using the power of crowdsourcing in realtime after a disaster makes a lot of sense.

I highly recommend that you take a look at all four videos and vote for your favourite here: https://www.mazdarebels.com/en-gb/content/four-inspiring-ted-fellows-one-mazda-grant/

* Applications are still open to become a 2015 TED Fellow – I can highly recommend it!


Really pleased to make my Milkman app available for all MWP users :)

Originally posted on The Milky Way Project Blog:

I’ve been building a new app for the Milky Way Project called Milkman. It goes alongside Talk and allows you to see where everyone’s clicks go, and what the results of crowdsourcing look like. It’s open source, and a good step toward open science. I’d love feedback from citizen scientists and science users alike.

Milkman

Milkman is so called because it delivers data for the Milky Way Project, and maybe eventually some other Zooniverse projects too. You can access Milkman directly at explore.milkywayproject.org (where you can input a Zooniverse subject ID or search using galactic coordinates), or more usefully, you can get to Milkman via Talk – using the new ‘Explore’ button that now appears for logged-in users.

Clicking ‘Explore’ will show you the core view of Milkman: a display of all the clicks from all the volunteers who have seen that image and the current, combined results.


Milkman is a live, near-realtime view of…



‘Something awesome from the Zooniverse every day’ was the tagline that we came up with, almost a year ago, for a new Zooniverse blog: Daily Zooniverse. Grant Miller had recently arrived to work at Zooniverse HQ in Oxford and I had a to-do list of things I’d always wanted to try but hadn’t found the time for. The Daily Zooniverse was right at the top.

The Zooniverse has spawned more than 30 citizen science projects, generated almost 100 peer-reviewed academic publications, and engaged more than one million people! Surely we had the capacity to share one cool thing every day? That was the challenge I laid at Grant’s feet last year and he has risen to it. Somehow, for the past 359 days, Grant has managed to post something (anything!) Zooniverse-related to the blog at daily.zooniverse.org.

Team birthdays, project status updates, suggested projects, and galaxy of the week are some examples of the blog’s regular features. There are the new projects that launch, the cool things the community find on Talk, and the awesome finds that just appear from seemingly nowhere. I love following this blog because it adds a little bit of Zooniverse to my RSS feed each day. I often see things I didn’t know about myself!


Congratulations to Grant on the blog’s birthday this week! Find the blog at daily.zooniverse.org or follow it via RSS, Twitter, Facebook, G+, and Tumblr.


This month’s edition of Wired (UK) includes a feature article about citizen science and crowdsourcing research. It has interviews with yours truly, as well as with many lovely people from the citizen science crowd, including buddies Chris Lintott, Kevin Schawinski, and Amy Robinson.

It also has notes about my new collaboration with fellow TED Fellow Andrew Bastawrous and our plans to use the Zooniverse to help cure blindness around the world. As you can imagine I’m pretty psyched about that! You can watch Andrew’s great TED talk below for more about his work.

 

The article is written by João Medeiros and you can find it in Wired UK either physically or via their many digital apps. It will be online at a later date. The only correction I feel the need to state is that more than 400,000 people have taken part in Galaxy Zoo – not 4,000 as it says in the article!

The latest issue of Astronomy & Geophysics includes an article by yours truly about the GitHub/.Astronomy Hack Day at the UK’s National Astronomy Meeting in Portsmouth earlier this year.

The projects resulting from hack days are often prototypes, or proof-of-concept ideas that are meant to grow and expand later. Often they are simply written up and shared online for someone else to take on if they wish. This ethos of sharing and openness was evident at the NAM hack day, when people would periodically stand up and shout to the room, asking for anyone with skills in a particular area, or access to specific hardware.

Take a look here: http://astrogeo.oxfordjournals.org/content/55/4/4.15.full?keytype=ref&ijkey=kkvGWSg3ABbIy5S

Martian Nyan Cat

I got into a conversation recently about how some astronomical photos can totally change your whole perspective of yourself and your place in the Universe. There are several images that come to mind right away – here are my own favourites:

1. The Milky Way (from a very dark location)

Milky Way

Seeing the night sky from a dark site is something most people don’t do very often, now that most of us live in cities. The vision of the Milky Way overhead can be startling, and a pair of binoculars makes it even more so, revealing that its delicate structure is made of millions of stars. This long-exposure photo of the dust lanes in our galaxy [1] is our first image that can really change your perspective on yourself and your place in the cosmos.

2. Earthshine on a crescent moon

Young Crescent Moon with Earthshine

When the Moon is just a thin crescent in the evening sky you can often see the rest of its face, dimly lit, and slightly reddened. This part of the Moon is not being illuminated by the Sun, like the crescent shape itself, but rather by the reflection of light from the Earth where the Sun has not yet gone down over the horizon. You’re seeing other people’s daylight, bounced back at you from around the world [2][3].

3. Aurora and lightning from the ISS

Sometimes a change in perspective can be quite literal – as with this video of the Earth seen from the International Space Station. The green structures are aurorae – the Northern Lights, over Canada in this case. You can also catch the occasional flash of lightning. This time-lapse is haunting and shows you a view you could probably never otherwise see.

4. M31 compared to a full moon


The Andromeda Galaxy is our nearest neighbouring galaxy and can be seen as a faint fuzzy patch in the northern sky. What is amazing, though, is to realise that it is in fact quite a large object – bigger than our own Moon in our sky, spanning roughly six times the width of the full Moon. Our eyes just don’t see it very well! Long-exposure images show just how big it really is. Combine this with the fact that it is 2.5 million light years away [4] and you begin to realise that the galaxy next door is truly enormous. It’s about the same shape, size, and type as our own Milky Way too. So we will look pretty similar to anyone looking up at the sky from a planet in the Andromeda galaxy.

5. Earth from Saturn (and other places)


There are perhaps no images quite as humbling and perspective-shifting as the set of images we would probably call the ‘pale blue dots’. These are the small set of images of the Earth from far, far away, taken by the robots we have sent out into the Solar System. Voyager 1 took one in 1990 from about 6 billion kilometres away; Cassini has taken more than one from the Saturnian system (like the one above); a few have been taken from Mars too. All of them show the Earth as just a pixel or so across: encompassing all of humanity, the world, and all life as we know it in a teeny tiny speck against the cosmos.

6. Orion’s Proplyds

Orion Nebula proplyd atlas

These dark blobs hidden within the star-forming complex of the Orion nebula are known as proplyds – or protoplanetary disks. These are embryonic solar systems in the making. Each of these blobs is far larger than our own Solar System (they shrink as they evolve into mature planetary systems), which gives you some idea of how large the Orion Nebula is in total. We were once shrouded in such a dusty blob ourselves – though long before the Earth formed.

7. The Sloan Great Wall


The largest surveys of galaxies reveal structures in the Universe so vast that they are practically beyond comprehension – but let’s try anyway, shall we? The Sloan Great Wall is a filament of galaxies snaking through the Universe, all of which appear to be physically connected to each other – bound by gravity. The ‘wall’ is 1.38 billion light years across. That’s 1/67th of the observable Universe! When light is emitted on one side it doesn’t reach the other end for 1.38 billion years. It is roughly 550 times as long as the distance between the Milky Way and Andromeda. I told you it was hard to imagine.
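For the numerically inclined, here’s a quick back-of-envelope check of those ratios (a sketch using the usual round-number values, so the answers are approximate):

```python
# Rough scales, all in light years - round-number values for illustration.
great_wall = 1.38e9   # length of the Sloan Great Wall
observable = 9.3e10   # approximate diameter of the observable Universe
andromeda = 2.5e6     # distance from the Milky Way to Andromeda

print(observable / great_wall)  # ~67: the wall spans about 1/67th
print(great_wall / andromeda)   # ~550 Milky Way-Andromeda distances
```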

8. Apollo 8 on Christmas Eve 1968


I thought it would be good to end on something a little closer to home. On December 24th 1968, astronauts Bill Anders, Jim Lovell, and Frank Borman were the voices heard on one of the most-watched television broadcasts of all time. As they read passages from the Bible’s Book of Genesis, they broadcast a grainy image of the Earth, as seen from the orbit of the Moon. The world watched itself from space for the first time, and saw the Earth as a singular marble, set against the deep black of space. The image has since been remastered and still represents an era, and a moment in human history, that many find totally perspective-changing. A symbol of a race of beings from a tiny planet, venturing outward to explore space and the worlds beyond their own. Remarkable.


[1] I recently had my first go at some proper astrophotography from a dark site. My target was the Milky Way, and the result was this image of the dust lanes of our galaxy, looking toward the galactic centre. I’m pretty happy with it for a first go.

[2] This effect can also be seen on other moons around other planets and is generically called ‘planetshine’.

[3] This also serves as a good reminder that there is a part of the Moon we never see – the far side – which is lit by the Sun, but just never seen from Earth.

[4] That distance gets smaller all the time, and Andromeda will actually collide with us in about 4 billion years.


Executable papers are a cool idea in research [1]. You take a study, write it up as a paper, and bundle together all your code, scripts and analysis in such a way that other people can take the ‘paper’ and run it themselves. This has three main attractive features, as I see it:

  1. It provides transparency for other researchers and allows everyone to run through your working to follow along step-by-step.
  2. It allows your peers to give you detailed feedback and ideas for improvements – or to make the improvements themselves.
  3. It allows others to take your work and try it out on their own data.

The main problem is that these don’t really exist ‘in the wild’, and where they do, they’re in bespoke formats, even if they’re open source. The IPython Notebook is a great way of doing something very much like an executable paper, for example. Another way would be to bundle up a virtual machine and share a disk image. Executable papers would allow for rapid-turnaround science to happen. For example, let’s imagine that you create a study and use some current data to form a theory or model. You do an analysis and create an executable paper. You store that paper in a library and the library periodically reruns the study when new data become available [2]. The library might be a university library server, or maybe it’s something like the arXiv, ePrints, or GitHub.
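To make that concrete, here’s a minimal sketch of what such a ‘library’ process might look like. Everything in it – the data URL, the function names, the polling interval – is a hypothetical placeholder, not a real service:

```python
# Minimal sketch: re-run an executable paper whenever its input data change.
# The URL and function bodies are placeholders for illustration only.
import hashlib
import time
import urllib.request

DATA_URL = "https://example.org/survey/latest.csv"  # hypothetical data feed

def fetch_data() -> bytes:
    with urllib.request.urlopen(DATA_URL) as response:
        return response.read()

def run_analysis(raw: bytes) -> None:
    # A real executable paper would re-run its bundled scripts or
    # notebook here, regenerating figures and results.
    print(f"Re-running analysis on {len(raw)} bytes of new data...")

last_hash = None
while True:
    raw = fetch_data()
    digest = hashlib.sha256(raw).hexdigest()
    if digest != last_hash:       # only re-run when the data actually change
        run_analysis(raw)
        last_hash = digest
    time.sleep(24 * 60 * 60)      # poll once a day
```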

This is roughly what happens in some very competitive fields of science already – only with humans. Researchers write papers using simulated data, and the instant they can access the anticipated real data they import it, run their code, and publish. With observations of the Cosmic Microwave Background (CMB) it is the case that several competing researchers are waiting to work on the data – and new data come out very rarely. In fact, the day after the Planck CMB data were released last year, there was a flurry of papers submitted to the arXiv. Those who got in early likely had pre-written much of the work and simply ran their code as soon as they had downloaded and parsed the new, published data.

If executable papers could be left alone to scan the literature for new, useful data then they could also look for new results from each other. A set of executable papers could work together, without planning, to create new hypotheses and new understanding of the world. Whilst one paper crunches new environmental data, processing it into a catalogue, another could use the new catalogue to update climate change models and even automatically publish significant changes or new potential impacts for the economy.

It should be possible to make predictions in executable papers and have them automatically check for certain observational data and republish updated results. So one can imagine a topical astronomy example where the BICEP2 results would be automatically checked against any released Planck data, with new publications created when statistical tests are met. Someone should do this if they haven’t already. In this way, papers can continue to further, or verify, our understanding long after publication.
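As a sketch of what that automated check might look like – using a simple chi-square test as a stand-in for whatever statistic a real paper would specify, with made-up numbers throughout:

```python
# Hypothetical sketch: test a stored prediction against newly released
# data and flag when the result is worth republishing. The statistic,
# threshold, and numbers are all illustrative.
import numpy as np
from scipy import stats

def prediction_holds(predicted: np.ndarray, observed: np.ndarray,
                     alpha: float = 0.01) -> bool:
    """Chi-square test: True if observations are consistent with prediction."""
    chi2 = np.sum((observed - predicted) ** 2 / predicted)
    p_value = stats.chi2.sf(chi2, df=len(observed) - 1)
    return p_value > alpha

predicted = np.array([10.0, 20.0, 30.0])   # the paper's stored prediction
observed = np.array([11.0, 19.0, 31.0])    # newly released measurements
if prediction_holds(predicted, observed):
    print("Consistent with prediction - trigger an updated publication")
```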

SKA Rendering (Wikimedia Commons)

This is high-frequency science [3], akin to high-frequency trading, and it seems like an interesting approach to some upcoming data-flow issues in science. The Large Hadron Collider (LHC), the Large Synoptic Survey Telescope (LSST), and the Square Kilometre Array (SKA) are all huge scientific instruments set to explore new parts of the universe and gather huge volumes of data to be analysed.

Even the deployment of Zooniverse-scale citizen science cannot get around the fact that instruments like the SKA will create volumes of data that we don’t know what to do with, at a pace we’ve never seen before. I wonder if executable papers, set to scour the SKA servers for new data, could alleviate part of the issue by automatically searching for theorised trends. The papers would be sourced by the whole community, and peer-reviewed as is done today, effectively crowdsourcing the hypotheses through publications. This cloud of interconnected, virtual researchers, would continuously generate analyses that could be verified by some second peer-review process; since one would expect a great deal of nonsense in such a setup.

When this came up at a meeting the other day, Kevin Page (OeRC) remarked that we might just be describing sensors. In a way he’s right – but these are software sensors, built on the platform and infrastructure of the scientific community. They’re more like advanced tools; a set of ghost researchers, left to think about an idea in perpetuity, in service of the community that created them.

I’ve no idea if I’m describing anything real here – or if it’s just a way of partially automating the process of science. The idea stuck with me and I found myself writing about it to flesh it out – thus this blog post – and wondering how to code something like it. Maybe you have a notion too. If so, get in touch!

———-

[1] But not a new one really. It did come up again at a recent Social Machines meeting though, hence this post.
[2] David De Roure outlined this idea quite casually in a meeting the other day. I’ve no idea if it’s his or just something he’s heard a lot and thought was quite cool.
[3] This phrasing isn’t mine, but as soon as I heard it, I loved it. The whole room got chatting about this very quickly so provenance was lost I’m afraid.

Warning: 400 words of geekery ahead!

I’ve embarked on an extremely nerdy and wonderful new project: a podcast about rewatching Star Trek. Each week we encourage listeners to watch the same episode we have, and then we’ll dissect and discuss it in deliciously geeky detail.

My cohost in this trek beyond the podcasting frontier is friend and fellow Zooniverse workhorse Grant Miller. Grant sometimes fills in for Chris Lintott on our regular astronomy/science series Recycled Electrons. The other week Grant and I ended up talking about Star Trek on the show, and a friend remarked that they’d totally listen to us doing a podcast about Star Trek. We’re easily persuaded by flattery, and so ‘Star Trek: The Rewatch’ was born.

The Rewatch

Our first episode is now up, in which we discuss Encounter at Farpoint – the pilot episode of Star Trek: The Next Generation. We’re hoping that re-watching the show, in order, will make us appreciate it all over again. When you rewatch a TV show like this, you get to enjoy each episode in the context of knowing the series and characters really well, and there’s loads of interesting trivia and back-story that is great to explore. Grant and I are both astrophysics PhDs, so we’re also hoping to bring some serious science talk to the show from time to time. We know lots of experts in various fields from around the University (of Oxford) who we hope can pop in and comment every now and then.

Although I’m obviously looking forward to some of my favourite episodes (e.g. Best of Both Worlds, Tapestry) I’m also keen to see how some of the older, or more obscure, episodes hold up to the ensuing decades and changes in the way we enjoy Sci Fi, and TV in general. This is a podcast that I would totally have listened to – so it’s going to be fun to record it. To be honest, even if no one listens, this is going to be awesome! That’s how geeky I am.

If you loved the adventures of Captain Picard and co. – and want to watch them all over again – then join us! Check out startrek.therewatch.com or find us on iTunes. We’re also to be found on Twitter @StarTrekRewatch and on Facebook too.

Engage!

Yesterday was the Hack Day at the UK National Astronomy Meeting 2014 in Portsmouth. I organised it with my good friend Arfon Smith of GitHub, formerly of the Zooniverse. We wanted to try and start a new NAM tradition – it went well, so maybe we did. I’m psyched that .Astronomy got to help make it happen – not just through my involvement, but through the many .Astronomy alumni who attended!
Some of the hack projects have already started to appear online. Geert Barentsen, Jeremy Harwood, and Leigh Smith (Hertfordshire) created a Martian Nyan Cat, which is planning to fly over the entirety of ESA’s Mars Express data archive in one continuous, two-day-long flight. You can also grab the code for Duncan Forgan’s beautiful ‘Music of the Spheres’ project, which sonifies the rhythms of planetary systems. Other projects are harder to place online, such as Jane Greaves’ knitted galaxy cluster – with dark matter contributed by many people during the hack day itself.

I spent much of the day working with Edward Gomez (LCOGT) on the littleBits Space Kit. littleBits is a modular system of circuits that lets anyone try their hand at something that would ordinarily require a soldering iron. littleBits components may be switches, sensors, servos, or anything really, and they connect magnetically to create deceptively simple circuits that can be quite powerful.


For example, you could connect an infrared sensor and an LED to make a device that flashes when you press buttons on your remote. Or you could use a microphone and a digital LCD display to create a sound meter. The littleBits components are sturdy enough to withstand being bashed about a bit, and simple and large enough to be stuck onto cardboard, homemade figures, or anything else you find around the house. I found out about littleBits when I met their creator, Ayah Bdeir, at TED in March. She is a fellow TED Fellow.

We decided fairly quickly to try and build an exoplanet simulator of some sort, and ended up creating the littleBits Exoplanet Detector (and cup orrery). There were two parts to this: a cup-based orrery, and a transit detector.

The cup orrery consisted of a rotating ‘planetary system’ fashioned from a coffee cup mounted on a simple motor component – we only had hack day supplies to play with – and a central LED ‘star’. Some more cups and stirrers were required to scaffold the system into a stable state but it was soon working.

The transit detector used a light-sensor component that read out to both a speaker and an LCD numerical display – Ed refers to this as the laser display board. With a bit of shielding from the buffet’s handy, black, plastic plates, the light sensor can easily pick up the LED, and you can see the light intensity readout varying as the paper planet passes in front of the star. It was awesome. We got very excited when it actually worked!

You might think that was geeky enough, but it gets better. I realised I could use my iPhone 5s – which has a high-frame-rate video mode – to record the model working in slow motion and allow us to better see the digital readout. We also realised that the littleBits speaker component can accept an audio jack, so we could use the phone to feed in a pure tone, which made it much easier to hear the pulsing dips of the transits.

Finally, we had the idea to record this nice, tonal sound output from the detector and create waveforms to see if we could recover any properties about the exoplanets. And sure enough: we can! We built several different coffee-cup planetary systems (including a big planet, small planet, and twin planets) and their different properties are visible in their waveforms. Ed is planning a more rigorous exploration of this at a later date, but you can see and hear the large cup planet’s waveform below.

Waveform for Large Cup Planet
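If you fancy trying the analysis yourself, here’s a rough sketch of the sort of thing Ed might do (my own guess at it, not his actual method, and the filename, window size, and threshold are all made up): load the recording, compute the amplitude envelope of the tone, and flag the dips where the cup planet blocks the LED.

```python
# Rough sketch: find transit dips in a recording of the detector's tone.
# The filename, window size, and threshold are illustrative guesses.
import numpy as np
from scipy.io import wavfile

rate, samples = wavfile.read("large_cup_planet.wav")  # hypothetical recording
samples = samples.astype(float)
if samples.ndim > 1:                  # mix stereo down to mono
    samples = samples.mean(axis=1)

# Amplitude envelope: RMS level in 50 ms windows.
window = int(0.05 * rate)
n_windows = len(samples) // window
chunks = samples[: n_windows * window].reshape(n_windows, window)
envelope = np.sqrt(np.mean(chunks ** 2, axis=1))

# Transits show up as windows well below the typical level.
threshold = 0.8 * np.median(envelope)
in_transit = envelope < threshold
print(f"{in_transit.sum()} of {n_windows} windows look like transit dips")
```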

So if you want to try something like this, you only need the littleBits Space Kit. You can buy them online and I’d love to see more of these kits, and to see them in schools. I’m now totally addicted to the idea myself too!

GitHub Stickers

Thanks to Arfon for suggesting that we do this Hack Day together; to the NAM 2014 Portsmouth team for being so supportive; and to GitHub for sponsoring it – where else would we have gotten all the cups?!