Archives For Zooniverse

I’ve been called a lot of things but ‘rebel’ hasn’t come up too often. Not that I mind. As part of a Mazda campaign, I’m being highlighted as one of four TED Fellows* who are ‘Mazda Rebels’. The other three are thoroughly impressive and I recommend you take a look. There’s an online vote where the public help choose whoever they think deserves a Mazda grant to help their project.

My video can be found here. It’s lovely and I really enjoyed making it. It nicely describes my work with Zooniverse (special guest starring Brooke Simmons as a Snapshot Serengeti volunteer!) in a fun, accessible way. We had a laugh creating it, and they have kept many of the out-takes in the video, which I rather enjoyed.

If I win the vote then I’ll be using the money to kick-start the Zooniverse’s efforts in disaster relief with a ‘First Responders’ project. Think Milky Way Project but with aerial photos of recent disasters, with volunteers helping locate resources, danger, and people. This is something several of us at Zooniverse HQ are very keen on, and using the power of crowdsourcing in realtime after a disaster makes a lot of sense.

I highly recommend that you take a look at all four videos and vote for your favourite here: https://www.mazdarebels.com/en-gb/content/four-inspiring-ted-fellows-one-mazda-grant/

* Applications are still open to become a 2015 TED Fellow – I can highly recommend it!

‘Something awesome from the Zooniverse every day’ was the tagline that we came up with, almost a year ago, for a new Zooniverse blog: Daily Zooniverse. Grant Miller had recently arrived to work at Zooniverse HQ in Oxford and I had a todo list of things I’d always wanted to try but hadn’t found the time for. The Daily Zooniverse was right at the top.

The Zooniverse has spawned more than 30 citizen science projects, generated almost 100 peer-reviewed academic publications, and engaged more than one million people! Surely we had the capacity to share one cool thing every day? That was the challenge I laid at Grant’s feet last year and he has risen to it. Somehow, for the past 359 days, Grant has managed to post something (anything!) Zooniverse-related to the blog at daily.zooniverse.org.

Team birthdays, project status updates, suggested projects, and galaxy of the week are some examples of the blog’s regular features. There are the new projects that launch, the cool things the community find on Talk, and the awesome finds that just appear from seemingly nowhere. I love following this blog because it adds a little bit of Zooniverse into my RSS feed each day. I often see things I didn’t know about myself!

Congratulations to Grant on the blog’s birthday this week! Find the blog at daily.zooniverse.org or follow it via RSS, Twitter, Facebook, G+, and Tumblr.

Today is the start of the UK National Astronomy Meeting (NAM) in Portsmouth. I’ll be there tomorrow, and I’m running the NAM Hack Day on Wednesday with Arfon Smith – which is going to be awesome. Today at NAM, the nation’s astronomers will discuss the case for UK involvement in the Large Synoptic Survey Telescope project – the LSST. The LSST is a huge telescope, and a massive undertaking. It will change astronomy in a profound way.

A photograph and a rendering mix of the exterior LSST building, showing the dome open and road leading away from the site.

With every image it takes, the LSST will be able to record a very large patch of sky (~50 times the size of the full Moon). It will take more than 800 images each night and can image its* entire sky twice a week! Billions of galaxies, stars, and solar system objects will be seen for the first time and monitored over a period of 10 years. Crucially, it will use its rapid-imaging power to look for moving or ‘transient’ things in the night sky. It will be an excellent tool for detecting supernovae, asteroids, exoplanets and other things that move from night to night or week to week. For example, the LSST could be used to detect and track potentially hazardous asteroids that might impact the Earth. It will also help us understand dark energy – the mysterious force that seems to keep our universe expanding – by mapping the precise locations of billions of galaxies.
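
To get a feel for those numbers, here’s a rough back-of-envelope sketch in Python. The LSST field of view (~9.6 square degrees) and survey footprint (~18,000 square degrees) are my own assumed figures, not values quoted in this post.

```python
# Rough back-of-envelope numbers for the LSST's survey speed.
# Assumed figures (not from this post): a ~9.6 deg^2 field of view
# and a ~18,000 deg^2 visible survey footprint.

FIELD_OF_VIEW_DEG2 = 9.6         # sky area covered by a single LSST image
IMAGES_PER_NIGHT = 800           # quoted above
SURVEY_FOOTPRINT_DEG2 = 18_000   # assumed size of the sky the LSST can see
FULL_MOON_DEG2 = 0.2             # apparent area of the full Moon

# How many full Moons fit into one image? (~50, as quoted above)
print(FIELD_OF_VIEW_DEG2 / FULL_MOON_DEG2)           # ~48

# Sky area imaged per night, ignoring overlap between pointings.
area_per_night = FIELD_OF_VIEW_DEG2 * IMAGES_PER_NIGHT
print(area_per_night)                                 # 7,680 deg^2

# Nights needed to cover the footprint once: a couple of nights,
# consistent with imaging the visible sky roughly twice a week.
print(SURVEY_FOOTPRINT_DEG2 / area_per_night)         # ~2.3
```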

I’ve recently become LSST:UK’s Public Data Coordinator – think ‘chief hacker’ if you prefer. The LSST’s unprecedented archive of data will be a resource we can tap into to create new kinds of public outreach tools, data visualisations, and citizen science. In recent years, we at the Zooniverse have pioneered citizen science investigations of data in astronomy**. The citizen science and amateur astronomy communities around the UK, and the world, will be able to access the amazing data that comes out of the LSST both through structured, Zooniverse-style projects and in a more freeform manner. The potential for discovery will be on a scale we haven’t seen before. It’s very exciting.

The LSST is a public-private partnership and is led by the United States. The unique scientific opportunities presented by the LSST have led to the formation of a group of astronomers from more than 30 UK universities. We’ll be asking for funding from the Science and Technology Facilities Council to support UK participation in the project.

Spinnaker Tower from the Gosport Ferry

If you’re at NAM this week, then I’d love to talk about LSST, hacking on data, and the Zooniverse. On Wednesday you’ll find me in the Park Building at the University of Portsmouth, at the GitHub/.Astronomy NAM 2014 Hack Day. I’ll also be at the GitHub drink up on Tuesday night at The White Swan from 7pm – where you can enjoy some of the finest cask ales, draught beers and wines in Portsmouth – and GitHub are paying! More details at https://github.com/blog/1849-github-meetup-in-portsmouth-uk.

* i.e. the sky visible from its location – not literally the entire sky
** We’ve now had more than 1 million volunteers pass through our digital doors.

Operation War Diary Screenshot

Working at the Zooniverse means that I get to indulge many of my interests beyond astronomy, like history. In January we launched a project in partnership with the Imperial War Museum and the National Archives called Operation War Diary. It’s a ‘citizen history’ site that asks the public to tag and transcribe more than one million pages of war diaries, and other handwritten notes, produced on the Western Front during the First World War. 2014 is the centenary year of the start of the war and we hope that this project will recover information that had been all but lost over the last one hundred years.

The results of this project are starting to appear now. The project is meant to run for several years, but there are already new ways to explore and understand the data thanks to the efforts of the tens of thousands of people who have taken part in Operation War Diary. As an example, the video below is an animation of the casualties reported in the diaries tagged so far. You can also see this map online at http://cdb.io/1pqB4kp.

This map was created by Operation War Diary developer Jim O’Donnell and it’s not a final map by any means, but it shows the power of crowdsourcing these kinds of tasks. If that all intrigues you, you can get involved here: http://www.operationwardiary.org/ – and read more about the project in this blog post. You can follow Operation War Diary on Facebook and Twitter too!
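
For a flavour of how a map like this might be put together, here’s a minimal, hypothetical sketch: it assumes a CSV export of volunteer tags with place, coordinate, date, and casualty-count columns (the filename and column names are invented for illustration, not Operation War Diary’s real schema) and aggregates them into something a tool like CartoDB could animate.

```python
# Hypothetical sketch: aggregate volunteer-tagged casualty reports by
# place and month, ready for upload to a mapping tool such as CartoDB.
# The input file and its columns (place, latitude, longitude, date,
# casualties) are invented for illustration.
import pandas as pd

tags = pd.read_csv("war_diary_tags.csv", parse_dates=["date"])

monthly = (
    tags
    .assign(month=tags["date"].dt.to_period("M").astype(str))
    .groupby(["place", "latitude", "longitude", "month"], as_index=False)
    ["casualties"]
    .sum()
)

# One row per place per month: a time-stamped point dataset that
# CartoDB (or any similar tool) can animate on a map.
monthly.to_csv("casualties_by_place_and_month.csv", index=False)
```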

I shared this today, in a presentation, as an example of the many projects created by the Zooniverse team, which could be described as ‘Social Machines’. I work as part of a project called SOCIAM, which is investigating these Social Machines as a research topic.

Social machines were predicted in the early days of the web and are emergent, social entities typified by large groups of people working together online to achieve things that neither the machine nor the human network could otherwise achieve. Zooniverse projects are a great example of a social machine for scientific discovery, which is why we were invited to join this collaboration.

Dave DeRoure’s ‘classic’ social machines explanation chart

Social machines often involve very large-scale human participation; they may generate large volumes of data; and they may try to solve social or technical problems from the reverse perspective. Wikipedia is a social machine too, for example, as are Reddit, eBay, and Ushahidi.

Operation War Diary and the other Zooniverse projects combine people in a way that can only be achieved through the web, and many of the participants then contribute in new and unexpected ways*, enriching the overall output of the platform. This makes it a notable social machine, and a great citizen science platform.

Sorry – this has been a rather rambling post. So to conclude, here’s a link to many more Operation War Diary maps, produced by Jim, on CartoDB: https://the-zooniverse.cartodb.com.

[* See Spanish Flu in Old Weather, Yellow Balls in MWP, Green Peas in Galaxy Zoo]

A new Milky Way Project paper was published to the arXiv last week. The paper presents Brut, an algorithm trained to identify bubbles in infrared images of the Galaxy.

Brut uses the catalogue of bubbles identified by more than 35,000 citizen scientists from the original Milky Way Project. These bubbles are used as a training set to allow Brut to discover the characteristics of bubbles in images from the Spitzer Space Telescope. This training data gives Brut the ability to identify bubbles just as well as expert astronomers!

The paper then shows how Brut can be used to re-assess the bubbles in the Milky Way Project catalog itself, and it finds that more than 10% of the objects in this catalog are really non-bubble interlopers. Furthermore, Brut is able to discover bubbles missed by previous searches too, usually ones that were hard to see because they are near bright sources.

At first it might seem that Brut removes the need for the Milky Way Project – but the truth is exactly the opposite. This new paper demonstrates a wonderful synergy that can exist between citizen scientists, professional scientists, and machine learning. The example outlined with the Milky Way Project is that citizens can identify patterns that machines cannot detect without training, while machine learning algorithms can use citizen science projects as input training sets – creating amazing new opportunities to speed up the pace of discovery. A hybrid model of machine learning combined with crowdsourced training data from citizen scientists can not only classify large quantities of data, but also address the weaknesses of each approach if deployed alone.
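
A minimal sketch of that hybrid loop, assuming the image cutouts have already been turned into feature vectors. Brut has its own feature extraction and classifier; the scikit-learn random forest and the placeholder arrays below are stand-ins purely for illustration.

```python
# Sketch of the citizen-science -> machine-learning loop described above.
# `features` stands in for per-image feature vectors derived from the
# infrared cutouts, and `volunteer_labels` for the Milky Way Project's
# bubble / not-bubble tags. Both are random placeholders here.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
features = rng.normal(size=(5000, 20))             # placeholder image features
volunteer_labels = rng.integers(0, 2, size=5000)   # placeholder crowd labels

X_train, X_test, y_train, y_test = train_test_split(
    features, volunteer_labels, test_size=0.25, random_state=0
)

# Train on the crowd's classifications...
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# ...then score every image: low scores flag likely interlopers in the
# existing catalogue, high scores on untagged images flag missed bubbles.
bubble_probability = clf.predict_proba(X_test)[:, 1]
print("held-out accuracy:", clf.score(X_test, y_test))
```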

We’re really happy with this paper, and extremely grateful to Chris Beaumont (the study’s lead author) for his insights into machine learning and the way it can be successfully applied to the Milky Way Project. We will be using a version of Brut for our upcoming analysis of the new Milky Way Project classifications. It may also have implications for other Zooniverse projects.

If you’d like to read the full paper, it is freely available online at the arXiv – and Brut can be found on GitHub.

[Cross-posted on the Milky Way Project blog]

TED 2014 has just ended here in Vancouver and I have finally now experienced an event I’ve heard a lot about for many years. I’ve watched TED talks online for as long as I’ve watched anything online, and the real deal did not disappoint. Attending TED for the first time has been intense, wonderful, and dizzying, and it was great to be here for its special, 30th anniversary year. Highlights from my experience are difficult to streamline into a blog post. So this is my best shot.

Each day at TED has presented numerous inspirational speakers and amazing ideas, curated into themes by the organisers such as ‘Liftoff!’, ‘Reshape’, ‘Hacked’, and ‘Onward’. These sessions took place in the event’s central venue: a custom-designed, wooden amphitheatre built over only a few days before the event opened. The talks were usually 12 or 18 minutes long and sometimes formatted as interviews where relevant. The changing topics and formats were paced in a way that meant I rarely felt tired or restless – which is amazing since I’ve had about 5 hours’ sleep each night! You can already see some talks from these sessions, including Colonel Chris Hadfield on conquering fear, and Edward Snowden on privacy – a talk delivered via a roving telepresence robot!

A secondary, slightly smaller venue housed the ‘All-Star’ sessions. These were totally packed-out as they consisted of many notable TED speakers from the past 30 years, each giving 4 minute talks to update or reflect on their work and ideas. Speakers here included a wide range of awesome folks such as Sir Tim Berners-Lee, Imogen Heap, Dan Gilbert, Jimmy Wales, Sir Martin Rees, and even General Stanley McChrystal. All of them had just 4 minutes, which kept the energy high, and the pace steady.

I was very happy to see that some of the best talks of the week were about science. Ed Yong (Nat. Geo. blogger) told us about parasites, Sara Lewis (Tufts) about fireflies and Andy Connolly (U. Washington) about the future of astronomy. All extremely well-crafted and well-delivered talks about often complex topics. Hugh Herr’s (MIT Media Lab) outline of the future of bionics and prosthetic limbs was not just a tale of amazing science, but also included a live performance by a ballroom dancer who lost her leg in the Boston Marathon bombings and can now dance again thanks to the help of his MIT lab. A perfectly ‘TED’ moment and a moving thing to witness.

There is also a special place in my heart for some of the technology speakers, including Margaret Gould Stewart, who talked about designing and changing Facebook and its impact on user behaviour; and Del Harvey, who won everyone over with her sardonic delivery of a talk about managing the stranger side of Twitter. Keren Elazari delivered a moving lesson in why hackers may keep us all safe and keep governments honest – a talk that will be timely if posted quickly to TED.com, and which proves how refreshing and important it is to have women talk about tech – something that is all too rare.

Supermodel Geena Rocero came out as transgender live on the TED stage; Mellody Hobson gave a challenging and optimistic talk on race; and Mark Ronson gave a talk/performance about remixing and reclaiming music – partly involving a live remixing of other TED talks within his own. It was pretty genius – though it may have been lost on a large chunk of the audience.

The average age of TED attendees is predictably quite high – and I think that must be part of the thinking behind the TED Fellows program – the whole reason I’m here with about 20 other folks from around the world. By supporting its growing Fellows community, TED is creating new connections and networks, but also injecting a chunk of people into the conference that otherwise would not be able to attend. As part of our participation, all the fellows give their own short talk at the opening of the event.

The prospect of giving my own 4 minute talk on Monday was a big part of my life leading up to the conference. 4 minutes is not a long time, and that fact seemed only to amplify the preparation required, and the intensity of my nerves. I felt shaky and sick as I walked into the lights on Monday morning, but once there, a strange calm fell upon me and I simply delivered my talk. My intense preparation suddenly seemed like a wise investment, and although I can’t say I relaxed, I definitely enjoyed it. Those 4 minutes flew by in the end.

I used my talk to highlight the wonderful work we do at the Zooniverse, and framed what we do in the context of ‘big data’ in science, and in the discoveries that are waiting to be made if we allow the public access to our data. I think it went well, and I’ve certainly had many attendees and journalists ask me about it in the days since. Not all the Fellows talks go online – but we’ll each get to see our own eventually. They edit them and send them out in the weeks and months to come.

The TED Fellows programme also provides coaching, mentorship, training and a bunch of other amazing experience and advice too. I can’t recommend it highly enough. A huge thank you goes to Tom Reilly, Shoham Arad, Sam Kelly, Corey Mohr, Patrick D’Arcy and the whole TED Fellows team. I’m so excited about the collaborations and ideas being generated between the group and what we can do in the future.

Giving the talk was unforgettable, and attending TED has been a dream come true. I am feeling motivated and inspired, but most importantly I’ve made lots of new connections and contacts for projects to work on in the near future. For now though we have one more engagement: a farewell dinner with the rest of the TED Fellows. Then it will be time to go back to Oxford and resume regularly scheduled programming. After .Astronomy I always get the .Astro blues and I can tell it will be the same for TED, but it is time to head home and see how I can take all these ideas and actually do something with them.

This was recorded at the Citizen Cyberscience Summit in London in February – it’s me summarising the Zooniverse for anyone out there that might like to try out our own brand of Citizen Science.

This week is the BBC’s Stargazing Live show: three now-annual nights of live stargazing and astronomy chatter, live from Jodrell Bank. CBeebies are also getting in on the act this year, which I’m excited about. The Zooniverse are part of the show for the third year running and this time I have the pleasure of being here on set for the show. In 2012 the Zooniverse asked the Stargazing Live viewers to help us discover a planet with Planet Hunters, and in 2013 we explored the surface of Mars with Planet Four. This year we are inviting everybody to use our Space Warps project to discover some of the most beautiful and rare objects in the universe: gravitational lenses.

Space Warps asks everyone to help search through astronomical data that hasn’t been looked at by eye before, and try to find gravitational lenses deep in the universe. We launched the site in 2013 and for Stargazing Live we’re adding a whole new dataset of infrared images. Your odds of finding something amazing are pretty good, actually!

Gravitational lenses occur when a massive galaxy – or cluster of galaxies – passes in front of more distant objects. The enormous mass of the (relatively) closer object literally bends light around it and distorts the image of the distant source. Imagine holding up a magnifying glass and waving it around the night sky so that starlight is bent and warped by the lens. You can see more about this here on the ESO website.
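
For a rough sense of scale, here’s a sketch computing the Einstein radius of a galaxy-scale lens. The lens mass and the distances below are illustrative assumptions of mine, not values from Space Warps, and the lens–source distance is a crude stand-in rather than a proper cosmological calculation.

```python
# Rough Einstein-radius estimate for a galaxy-scale gravitational lens:
#   theta_E = sqrt( (4 G M / c^2) * D_LS / (D_L * D_S) )
# The lens mass and the distances below are illustrative assumptions.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m s^-1
M_SUN = 1.989e30     # solar mass, kg
MPC = 3.086e22       # one megaparsec, m

M = 1e12 * M_SUN     # assumed lens mass: a massive galaxy (~10^12 M_sun)
D_L = 1000 * MPC     # assumed distance to the lens
D_S = 2000 * MPC     # assumed distance to the background source
D_LS = D_S - D_L     # crude stand-in for the lens-source distance

theta_E = math.sqrt((4 * G * M / c**2) * D_LS / (D_L * D_S))
arcsec = math.degrees(theta_E) * 3600
print(f"Einstein radius ~ {arcsec:.1f} arcseconds")   # of order an arcsecond or two
```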

We’ve been getting things ready all day and now I’m sitting here in the Green Room at Jodrell Bank waiting for the show to begin. Stargazing Live is an exciting place to be and everyone is buzzing about the show! That Chris Lintott bloke from the telly is here, as is K9 from Doctor Who – they both look excited.

Just over three years ago the Zooniverse launched the Milky Way Project (MWP), my first citizen science project. I have been leading the development and science of the MWP ever since. 50,000 volunteers have taken part from all over the world, and they’ve helped us do real science, including creating astronomy’s largest catalogue of infrared bubbles – which is pretty cool.

Today the original Milky Way Project is complete. It took about three years and users have drawn more than 1,000,000 bubbles and several million other objects, including star clusters, green knots, and galaxies. It’s been a huge success, but there’s even more data! So it is with glee that we have announced the brand new Milky Way Project! It’s got more data, more objects to find, and it’s even more gorgeous.

This second incarnation of my favourite Zooniverse project[1] has been an utterly different experience for me. Three years ago I had only recently learned how to build Ruby on Rails apps and had squirrelled myself away for hours carefully crafting the look and feel for my as-yet-unnamed citizen science project. I knew that it had to live up to the standards of Galaxy Zoo in both form and function – and that it had to produce science eventually.

Building and launching at that time was simpler in one sense (it was just me and Arfon that did most of the coding[2]) but so much harder, as I was referring to the Rails manual constantly and learning Amazon Web Services on the fly. This week I have had the help of a team of experts at Zooniverse Chicago, who I normally collectively refer to as the development team. They have helped me by designing and building the website and also by integrating it seamlessly into the now-buzzing Zooniverse infrastructure. The result has been an easier, smoother process with a far superior end product. I’ve essentially acted more like a consultant scientist, with a specification and requirements. I’ve still gotten my hands dirty (as you can see in the open source Milky Way Project GitHub repo) but I’ve managed to actually keep doing everything else I now do day-to-day at the Zooniverse. It’s been a fantastic experience to see personally how far we’ve come as an organisation.

The new MWP is being launched to include data from different regions of the galaxy in a new infrared wavelength combination. The new data consists of Spitzer/IRAC images from two surveys: Vela-Carina, which is essentially an extension of GLIMPSE covering Galactic longitudes 255°–295°, and GLIMPSE 3D, which extends GLIMPSE 1+2 to higher Galactic latitudes (at selected longitudes only). The images combine 3.6, 4.5, and 8.0 µm in the “classic” Spitzer/IRAC color scheme[3]. There are roughly 40,000 images to go through.
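
If you’re curious how a ‘classic’ three-colour IRAC image like these is assembled, here’s a minimal sketch using astropy and matplotlib. The FITS filenames and the simple percentile stretch are placeholders of mine, not the actual MWP image pipeline.

```python
# Minimal sketch of a "classic" Spitzer/IRAC three-colour composite:
# 8.0 um -> red, 4.5 um -> green, 3.6 um -> blue. The filenames and
# the percentile stretch are placeholders, not the MWP pipeline.
import numpy as np
import matplotlib.pyplot as plt
from astropy.io import fits

def stretch(data, lo=1.0, hi=99.5):
    """Clip to percentiles and rescale to the 0..1 range for display."""
    vmin, vmax = np.nanpercentile(data, [lo, hi])
    return np.clip((data - vmin) / (vmax - vmin), 0, 1)

r = stretch(fits.getdata("irac_8.0um_mosaic.fits"))   # hypothetical filenames
g = stretch(fits.getdata("irac_4.5um_mosaic.fits"))
b = stretch(fits.getdata("irac_3.6um_mosaic.fits"))

plt.imshow(np.dstack([r, g, b]), origin="lower")
plt.axis("off")
plt.show()
```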

An EGO (or two) sitting in the dust near a young star cluster

The latest Zooniverse tech and design is being brought to bear on this big data problem. We are using our newest features to retire images with nothing in them (as determined by the volunteers of course) and to give more screen time to those parts of the galaxy where there are lots of pillars, bubbles and clusters – as well as other things. We’re marking more objects – bow shocks, pillars, EGOs – and getting rid of some older ones that either aren’t visible in the new data or weren’t as scientifically useful as we’d hoped (specifically: red fuzzies and green knots).
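
Here’s a toy version of that retirement logic, with made-up thresholds; the real Zooniverse rules are more sophisticated than this.

```python
# Toy version of the retirement idea described above: retire an image once
# enough volunteers agree there's nothing in it, and give busy fields extra
# views. The thresholds are made up; the real rules are more sophisticated.

def next_state(views, empty_votes, objects_marked,
               min_views=5, max_views=40):
    if views >= min_views and empty_votes == views:
        return "retire"   # everyone so far says the image is empty
    if views >= max_views:
        return "retire"   # enough independent looks regardless
    if objects_marked / max(views, 1) > 3:
        return "boost"    # lots of markings per view: show it to more people
    return "keep"

print(next_state(views=6, empty_votes=6, objects_marked=0))    # retire
print(next_state(views=10, empty_votes=1, objects_marked=45))  # boost
print(next_state(views=10, empty_votes=4, objects_marked=8))   # keep
```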

It’s very exciting! I’d highly recommend that you go now(!) and start classifying at www.milkywayproject.org – we need your help to map and measure our galaxy.

—–

[1] It’s like choosing between your children

[2] Arfon may recall my resistance to unit tests

[3] Classic to very geeky infrared astronomers

We’re expanding the Zooniverse team in Oxford and we’re looking for web developers. You need to be able to work in Oxford (which is a lovely place to work) and you need to want to change the way science is done! There are four positions we need to fill.

Each of these roles has different responsibilities and there’s a range of skills that we’re after. We’re creating a core team of developers here in Oxford to work alongside our Chicago-based developers – but on different, new parts of the Zooniverse. You’ll not be building citizen science projects on a daily basis: instead these positions will mostly deal with infrastructure, pipelines and tools for citizen science. In my opinion it’s an amazing opportunity for any developers out there who love science. You will work within a team of about 10 people here in Oxford Astrophysics.

We’re really excited about the project that these people will be the core part of and I definitely encourage coder-type scientists and science-type coders to apply. The University is a great employer with a good pension scheme, mostly flexible hours and they’re very friendly towards families. We are a mixed group of developers, scientists and something in-between. Best of all, like the Zooniverse: we’re awesome.

So come, join us!