
Line 1. Let’s start with ‘typical’ humans. The average human adult male is 1.75 metres tall – that’s 3.83 cubits or 5.74 feet. The average female is 1.62 metres – that’s 5.4 light-nanoseconds or 0.008 furlongs.


You live on Earth (Sol d, perhaps?). This is an Earth-like planet in a Sun-like star system. The third planet of eight in a rich system, including at least one planet populated entirely by robots (Mars, perhaps?). Earth is 12,742 km in diameter and thus has a circumference of 40,000 km, or roughly 25,000 miles. Humans live in a thin layer (~20 km) around the surface called the troposphere. If the Earth were a beach ball then all life on Earth would exist within just 1 mm of the surface.
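The beach-ball comparison is just a scaling ratio. Here's the arithmetic as a quick sketch – the 30 cm ball diameter is my assumption; the other figures are from the text:

```python
# Scale the Earth down to a beach ball and see how thin the
# layer containing all life becomes.
EARTH_DIAMETER_KM = 12_742
BIOSPHERE_KM = 20         # the ~20 km troposphere
BALL_DIAMETER_MM = 300    # a typical beach ball, ~30 cm (assumed)

scale = BALL_DIAMETER_MM / EARTH_DIAMETER_KM  # mm on the ball per km on Earth
layer_mm = BIOSPHERE_KM * scale
print(f"All life fits in a layer {layer_mm:.2f} mm thick")  # ~0.47 mm
```

Which is where the "just 1 mm around the surface" figure comes from (it's actually closer to half a millimetre for a standard beach ball).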


Through many years of international effort we have managed to keep a ‘space’ station in orbit – just above this troposphere – 1 cm above the beach ball. But it is not high enough up to totally avoid the atmosphere – the ISS has to constantly boost itself back up because of air drag. We have sent just 24 people out into deep space, beyond the Earth’s atmosphere. All of them visited the Moon, and the last ones returned in 1972: 42 years ago. They were all men, all white, and all American. We could do it again, we could do it better – but we chose not to do so. (Mostly for political reasons, IMHO.)


Those astronauts visited the nearest body in space: the Moon – the second brightest thing in the sky. They were kind enough to return some photos to show us how teeny tiny we are, and how delicate our world really is. The Moon sits about a quarter of a million miles away (384,000 km). You could fit all the Solar System’s other planets in that gap.


But that doesn’t include the Sun – the brightest thing in the sky. The Sun is truly huge. You can fit the Earth inside the Sun a million times. It has more than enough room for all the planets and then some. The Sun itself sits 93 million miles away – which means that light takes 8 minutes to reach us from the Sun. The Sun could have gone out 7.9 minutes ago and you’d only find out… now. Nope: we’re ok. For now.
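The 8-minute figure falls straight out of dividing the distance by the speed of light. A quick sketch using the figures above:

```python
# Light travel time from the Sun to the Earth.
SUN_DISTANCE_MILES = 93_000_000
LIGHT_SPEED_MILES_PER_S = 186_282  # speed of light, miles per second

seconds = SUN_DISTANCE_MILES / LIGHT_SPEED_MILES_PER_S
print(f"Sunlight takes {seconds / 60:.1f} minutes to reach Earth")  # ~8.3
```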


And yet we have flung robots into space and downloaded the images they have recorded. Sometimes we take extremely long-range selfies of a sort: images of the Earth, of humanity, reduced to a pixel or two. Here’s one from Mars, one from Saturn and one from out near the edge of the Solar System – taken by Voyager. These images collectively earn us the moniker ‘pale blue dot’. Out by Pluto, the Sun itself has dimmed to look like just another star. From Saturn, we are just a couple of pixels as seen by the Cassini probe:


And truthfully, the Sun isn’t so special. In fact there are stars which make the Sun look even smaller than the Earth does here. VY Canis Majoris is staggeringly big – it could encompass the Sun 1,000,000,000 times. That’s a thousand trillion Earths. Oh, and VY Canis Majoris isn’t even visible to the naked eye: it’s so far away that we can’t detect its photons without the aid of telescopes or binoculars.

Which brings us to the Galaxy. The Sun is just one of hundreds of billions of stars orbiting around the Milky Way. If the Sun were a blood cell then the Milky Way would be the size of Europe. The Milky Way is staggeringly big and also staggeringly diffuse – so much so that if you took two Milky Ways, and hit one with the other, then in all likelihood no two stars would collide. They would pass through each other like smoke.

In fact this will happen. The Andromeda galaxy – which is a lot like the Milky Way – is on a collision course with us. In about 4 billion years it will begin to merge with our galaxy in a spectacular collision. We see these happening elsewhere, but the sheer scale of this vision in our own night sky makes me want to get a time machine and jump forward to see it happen. The Earth is unlikely to be affected, because of the lack of collisions – however our night sky will be spectacularly altered for billions of years. Makes you realise how dull it is right now. Just kidding!


But the Milky Way and Andromeda are just two out of hundreds of billions of galaxies in the Universe. Gigantic stellar continents floating in a vast void of almost nothing. Galaxies themselves form structures, and as we have looked deep into the cosmos we have seen one such structure: the Sloan Great Wall. A thick chain of galaxies, loosely bound to each other by gravity, stretching 1.4 billion light years across the Universe and sitting about 1 billion light years from the Milky Way. It’s roughly 1/60 of the observable Universe across. And yet there are even bigger things out there.


The largest known structure in the Universe is the Hercules–Corona Borealis Great Wall. At 10 billion light years across, this huge filament of galaxies is 1/10 the size of the observable Universe. It’s 100,000 times the size of the Milky Way, and about 70 thousand trillion times bigger than the Sun. We don’t have a good picture of it, but we know it’s there. It’s roughly 7,000,000,000,000,000,000 times bigger than the Earth, which is very much bigger than you. I refer you to line 1.

Help Count the Stars

January 1, 2015


Here’s a fun thing to do this January: help count the stars to see how dark the sky is near you. While you’re looking for Comet Lovejoy, take a moment to count some stars for a school project.

Over the past few weeks I’ve been helping A-Level student, and fellow Witney resident, Jesse Lawrence with a BSA Crest Award project. He opted to go for something with a local twist and has decided to map the quality of the dark skies around Witney. Now he’s embarked on the last phase of his project: crowdsourcing a dark sky map by recruiting volunteers (that’s you!).

It would be fantastic if you could add your own observations to the project. All you have to do is count the stars and fill in this form. For now, you need to be located in the Northern Hemisphere.

Although this began as a local project, the system is up and running and will work at scale so please fill in the form from anywhere – not just Witney.

You have to go out on a clear night and then report your location (your postcode or lat/long) along with the faintest star you can see in the Plough (or Saucepan, or Big Dipper, in Ursa Major). You just need to use Jesse’s map on the online form at http://bit.ly/DarkSkyMap. Find the faintest star that you can see from those marked with letters on the form. That helps identify the limit of brightness for your location. Repeat observations over several nights will help average out a better result, as will multiple people observing from the same spot over time.


The results appear on a live-updated map, which you can see at http://cdb.io/1rdVb4O. The more people that join in, the better the final map will be.


I’ve been called a lot of things but ‘rebel’ hasn’t come up too often. Not that I mind. As part of a Mazda campaign, I’m being highlighted as one of four TED Fellows* who are ‘Mazda Rebels’. The other three are thoroughly impressive and I recommend you take a look. There’s an online vote where the public help choose whoever they think deserves a Mazda grant to help their project.

My video can be found here. It’s lovely and I really enjoyed making it. It nicely describes my work with Zooniverse (special guest starring Brooke Simmons as a Snapshot Serengeti volunteer!) in a fun, accessible way. We had a laugh creating it, and they have kept many of the out-takes in the video, which I rather enjoyed.

If I win the vote then I’ll be using the money to kick-start the Zooniverse’s efforts in disaster relief with a ‘First Responders’ project. Think Milky Way Project but with aerial photos of recent disasters, with volunteers helping locate resources, danger, and people. This is something several of us at Zooniverse HQ are very keen on, and using the power of crowdsourcing in realtime after a disaster makes a lot of sense.

I highly recommend that you take a look at all four videos and vote for your favourite here: https://www.mazdarebels.com/en-gb/content/four-inspiring-ted-fellows-one-mazda-grant/

* Applications are still open to become a 2015 TED Fellow – I can highly recommend it!


Executable papers are a cool idea in research [1]. You take a study, write it up as a paper and bundle together all your code, scripts and analysis in such a way that other people can take the ‘paper’ and run it themselves. This has three main attractive features, as I see it:

  1. It provides transparency for other researchers and allows everyone to run through your working to follow along step-by-step.
  2. It allows your peers to give you detailed feedback and ideas for improvements – or to make the improvements themselves.
  3. It allows others to take your work and try it out on their own data.

The main problem is that these don’t really exist ‘in the wild’, and where they do they’re in bespoke formats, even if they’re open source. IPython Notebook is a great way of doing something very much like an executable paper, for example. Another way would be to bundle up a virtual machine and share a disk image. Executable papers would allow for rapid-turnaround science to happen. For example, let’s imagine that you create a study and use some current data to form a theory or model. You do an analysis and create an executable paper. You store that paper in a library and the library periodically reruns the study when new data become available [2]. The library might be a university library server, or maybe it’s something like the arXiv, ePrints, or GitHub.
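As a sketch of how a library might rerun a stored paper when new data appear, here's a minimal, entirely hypothetical shape in Python – the `ExecutablePaper` class and its callback are illustrative names, not any real system's API:

```python
# A toy model of the 'library re-runs the paper' idea. In reality the
# bundled analysis might be an IPython Notebook or a whole VM image.
class ExecutablePaper:
    def __init__(self, analysis, data_version=None):
        self.analysis = analysis          # the paper's bundled code
        self.data_version = data_version  # last dataset it ran against
        self.results = None

    def rerun_if_new(self, fetch_new_data):
        """Re-run the bundled analysis whenever a new dataset appears."""
        version, data = fetch_new_data()
        if version != self.data_version:
            self.results = self.analysis(data)
            self.data_version = version
            return True   # a new 'edition' of the paper is published
        return False      # nothing new; the paper stays as it is
```

A library server would simply loop over its stored papers, calling `rerun_if_new` whenever a data archive announces a release.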

This is roughly what happens in some very competitive fields of science already – only with humans. Researchers write papers using simulated data and, the instant they can access the anticipated real data, they import, run and publish. With observations of the Cosmic Microwave Background (CMB) it is the case that several competing researchers are waiting to work on the data – and new data come out very rarely. In fact, the day after the Planck CMB data were released last year, there was a flurry of papers submitted to the arXiv. Those who got in early had likely pre-written much of the work and simply ran their code as soon as they had downloaded and parsed the new, published data.

If executable papers could be left alone to scan the literature for new, useful data then they could also look for new results from each other. A set of executable papers could work together, without planning, to create new hypotheses and new understanding of the world. Whilst one paper crunches new environmental data, processing it into a catalogue, another could use the new catalogue to update climate change models and even automatically publish significant changes or new potential impacts for the economy.

It should be possible to make predictions in executable papers and have them automatically check for certain observational data and automatically republish updated results. So one can imagine a topical astronomy example where the BICEP2 results would be automatically checked against any released Planck data, with new publications created when statistical tests are met. Someone should do this if they haven’t already. In this way, papers could continue to further, or verify, our understanding long after publication.

SKA Rendering (Wikimedia Commons)


This is high-frequency science [3], akin to high-frequency trading, and it seems like an interesting approach to some upcoming data-flow issues in science. The Large Hadron Collider (LHC), the Large Synoptic Survey Telescope (LSST), and the Square Kilometre Array (SKA) are all huge scientific instruments set to explore new parts of the universe and gather huge volumes of data to be analysed.

Even the deployment of Zooniverse-scale citizen science cannot get around the fact that instruments like the SKA will create volumes of data that we don’t know what to do with, at a pace we’ve never seen before. I wonder if executable papers, set to scour the SKA servers for new data, could alleviate part of the issue by automatically searching for theorised trends. The papers would be sourced by the whole community, and peer-reviewed as is done today, effectively crowdsourcing the hypotheses through publications. This cloud of interconnected, virtual researchers would continuously generate analyses that could be verified by some second peer-review process, since one would expect a great deal of nonsense in such a setup.

When this came up at a meeting the other day, Kevin Page (OeRC) remarked that we might just be describing sensors. In a way he’s right – but these are software sensors, built on the platform and infrastructure of the scientific community. They’re more like advanced tools; a set of ghost researchers, left to think about an idea in perpetuity, in service of the community that created them.

I’ve no idea if I’m describing anything real here – or if it’s just an expression of a way of partially automating the process of science. The idea stuck with me and I found myself writing about it to flesh it out – thus this blog post – and wondering how to code something like it. Maybe you have a notion too. If so, get in touch!

———-

[1] But not a new one really. It did come up again at a recent Social Machines meeting though, hence this post.
[2] David De Roure outlined this idea quite casually in a meeting the other day. I’ve no idea if it’s his or just something he’s heard a lot and thought was quite cool.
[3] This phrasing isn’t mine, but as soon as I heard it, I loved it. The whole room got chatting about this very quickly so provenance was lost I’m afraid.

This week is the BBC’s Stargazing Live show: three now-annual nights of live stargazing and astronomy chatter, live from Jodrell Bank. CBeebies are also getting in on the act this year, which I’m excited about. The Zooniverse are part of the show for the third year running and this time I have the pleasure of being here on set for the show. In 2012 the Zooniverse asked the Stargazing Live viewers to help us discover a planet with Planet Hunters, and in 2013 we explored the surface of Mars with Planet Four. This year we are inviting everybody to use our Space Warps project to discover some of the most beautiful and rare objects in the universe: gravitational lenses.

Space Warps asks everyone to help search through astronomical data that hasn’t been looked at by eye before, and try to find gravitational lenses deep in the universe. We launched the site in 2014 and for Stargazing Live we’re adding a whole new dataset of infrared images. Your odds of finding something amazing are pretty good, actually!

Gravitational lenses occur when a massive galaxy – or cluster of galaxies – passes in front of more distant objects. The enormous mass of the (relatively) closer object literally bends light around it and distorts the image of the distant source. Imagine holding up a magnifying glass and waving it around the night sky so that starlight is bent and warped by the lens. You can see more about this here on the ESO website.

We’ve been getting things ready all day and now I’m sitting here in the Green Room at Jodrell Bank waiting for the show to begin. Stargazing Live is an exciting place to be and everyone is buzzing about the show! That Chris Lintott bloke from the telly is here, as is K9 from Doctor Who – they both look excited.

The View from Saturn

November 18, 2013


This image was taken by Cassini, the amazing spacecraft that has been orbiting Saturn and its moons for a decade. It shows a view toward the Sun from Saturn – the most distant planet normally visible with the naked eye. As well as showing Saturn’s rings in all their glory, several of Saturn’s moons are visible in this shot. Perhaps more amazingly, the Earth and Moon are seen as a bright spot in the lower-right, and Mars and Venus can be seen in the top-left. On July 19th 2013 the world was asked to wave at Saturn as this image was taken (and many people did).


If you want to feel humanity’s astronomical significance to its fullest just think of this photo, and then think of what it took to be able to obtain it. An amazing achievement for the Cassini team.

[Image Credit: NASA/JPL-Caltech/SSI]

Astronomy in Everyday Life

November 6, 2013

Astronomers are sometimes asked to defend public funding of their work. It’s difficult to answer because I really do think that there are lots of things we should do just because they’re interesting and enriching, and that science shouldn’t be limited to what is economically beneficial. That said, astronomy is often given an easy ride because it is pretty and we have people like Neil deGrasse Tyson, Brian Cox and Dara O’Briain on our side. One approach is to talk about how much useful stuff astronomy has produced.

When you look around your life – and your house – you’d be surprised at how much is connected to astronomy and space exploration. Assuming you’re like me (i.e. living in the UK in 2013) you probably own several pieces of space-based technology. For a start you most likely use WiFi – in fact you might be reading this via WiFi right now! WiFi is based on work by John O’Sullivan at CSIRO in Australia. The WLAN (Wireless Local Area Network) provided by your router results from technology developed by radio astronomers in Australia. More than a billion people were using it in 2013!


There’s also your GPS device. GPS determines your position by receiving the signals given off by a network of satellites orbiting the Earth. By comparing the time delay in the arrival of the different signals, the GPS chip can figure out its exact latitude and longitude to within about 10m. The GPS system not only involves satellites but each of those satellites houses an atomic clock and must incorporate Einstein’s equations for general relativity in order to know its position precisely [1]. It might be the most space-aged thing you own!
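The delay-to-distance step can be sketched in a few lines. The travel times below are made up for illustration, and a real receiver also solves for its own clock error, which needs a fourth satellite:

```python
# How GPS time delays become distances. Multiply each signal's travel
# time by the speed of light to get a range to that satellite; the
# receiver's position is where the ranges intersect.
C = 299_792_458  # speed of light, m/s

# Hypothetical signal travel times from three satellites (seconds)
travel_times = [0.0701, 0.0742, 0.0689]

ranges_km = [C * t / 1000 for t in travel_times]
for i, r in enumerate(ranges_km, 1):
    print(f"Satellite {i}: {r:,.0f} km away")

# Why the atomic clocks matter: 10 m of position error is only ~33 ns
print(f"10 m corresponds to {10 / C * 1e9:.0f} ns of timing error")
```

That last line is why each satellite carries an atomic clock and why relativistic corrections can't be ignored: nanoseconds of drift translate directly into metres of error on the ground.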

There’s a small chance that you sleep on a Memory Foam mattress or pillow. Memory Foam was created in 1966 by contractors working for NASA, as a way to better cushion and secure people going into space [3]. Similarly, iodine water filters derive from NASA work in the 1970s to create safe drinking water on long missions, and scratch-resistant glass coatings were created to make better visors for astronauts.


Contrary to popular belief, Teflon (the non-stick coating on saucepans) was not invented by NASA for the Apollo programme. In fact, it already existed and was simply used by NASA, who may have helped popularise it in industry at the time. I’ll also not mention CCDs here, since I’m no longer sure that astronomy had much to do with their success [2].

Outside of your home, there are many other places where the technology results from space research. There is a great deal of medical tech that comes from space exploration, which shouldn’t be surprising given that both fields are often trying to see or detect things in tricky or unusual environments. Software for detecting things in satellite imagery is being applied in medicine, including to detect the signs of Alzheimer’s disease in brain scan data. The detection of breast cancer tumours was vastly improved by techniques from radio astronomy, and instruments that began as ways to delicately monitor the temperature of fragile telescope instruments are used in neonatal care today. At the airport the X-ray scanner uses tech derived from X-ray telescopes [4], and they may sometimes check your bag or coat for traces of certain chemicals by placing it in a gas chromatograph originally designed for a Mars mission [4].

Astronomers are often also coders and software developers. As well as being responsible for the 2008 banking fiasco (I’m joking, maybe) they are also good at creating software that others find very handy. The visualisation software IDL is many astronomers’ language of choice and was developed in the 1970s at the Laboratory for Atmospheric and Space Physics (LASP) at the University of Colorado at Boulder [5]. IDL is used in lots of research today in areas including defence and climate monitoring, and by companies like Texaco, BP, and General Motors [6].


All of this is just the practical, modern stuff. Let’s not forget another thing you hold very dear: time itself. The calendar, especially its handy monthly segments, is astronomical in origin. The second, which seems so commonplace (i.e. it happens all the time), was defined in terms of the Earth’s rotation until astronomers realised that the length of a day was changing, and so suggested a switch to defining it in terms of the Earth’s orbit around the Sun. Then we realised using an atomic clock would make more sense and handed our time-defining powers over to the particle physicists [7].

Finally, I just want to say that yesterday a paper appeared on the arXiv titled ‘Why is Astronomy Important?’ and it prompted me to finish this blog post about astronomy in everyday life, which I’ve had kicking around for ages. A big thanks to Marissa Rosenberg, Pedro Russo, Georgia Bladon, and Lars Lindberg Christensen for their timely paper – and their handy references!

UPDATE: There are also two handy booklets on this topic from the Royal Astronomical Society, you can find them here and here.

———-

  1. http://www.physics.org/article-questions.asp?id=55
  2. If you’re into digital photography then you may have debated the benefits of the CMOS and CCD imaging technologies. All digital cameras, camera phones and webcams use one of these two types of tech. CCDs were developed in 1969 at Bell Labs (the 2009 Nobel Prize was awarded to their inventors Smith and Boyle) and they became very popular in astronomy. CCDs are said to have been popularised by their use in the Hubble Space Telescope, but I’m not sure I buy it and can’t find evidence for it.
  3. http://en.wikipedia.org/wiki/Memory_foam
  4. http://www.nap.edu/catalog.php?record_id=12951
  5. http://en.wikipedia.org/wiki/IDL_(programming_language)
  6. http://arxiv.org/abs/1311.0508
  7. http://tycho.usno.navy.mil/leapsec.html

UNAWE's Citizen Science Astronomy Projects Poster - CAP conference 2013

This is a poster from CAP2013, which I am attending in Warsaw. I love the idea and the design. Follow @UNAWE on Twitter and find them online at http://unawe.org/.

.Astronomy 5: What’s Next?

September 20, 2013

The .Astronomy 5 Unphoto – Credit: Demitri Muna

As the fifth .Astronomy came to a close on Wednesday, I felt as I always do at the end of these meetings: tired, emotional and super-excited. It’s hard to explain the energy at these events. There is something almost magical in the air as the participants ‘click’ (usually about an hour in) and then begin talking, making and doing great work.

.Astronomy is about actually doing something. As Kelle Cruz and I remarked yesterday – we like ‘people that do shit’. At .Astronomy you feel that if someone has an idea we should just all try and make it happen. It could be the best thing ever, and failure is just a chance to learn. It’s not a common attitude in astronomy and it’s certainly difficult for many early-career people to think that way.

I’ve always been lucky. My PhD supervisor was very willing to let me try crazy things (he let me get distracted by creating .Astronomy for a start!). At the Zooniverse we have spent years now, just pushing code live and making new things. They’re not always perfect, but we learn every time and we have left a trail of marvellous creations on the way. Each new thing learns from the last.

We also absorb the ideas of others quickly, and encourage collaboration with new people. It’s this approach that led to the creations of some of our most interesting projects recently, such as Snapshot Serengeti, the Andromeda Project and Space Warps.

During his Keynote talk Tony Hey (Microsoft Research) showed a quote I’ve not seen before.

“If you don’t like change, you’re going to like irrelevance even less.” – General Eric Shinseki, retired Chief of Staff, U. S. Army

I think I might put this on my wall. It sums up perfectly how I see much of science and could easily be the motto of .Astronomy. Tony’s keynote was brilliant BTW and you can see it here. Tony spoke about the Fourth Paradigm and told the tale of how the availability of astronomical data led to the SDSS SkyServer, which sparked the creation of Galaxy Zoo, which sparked the Zooniverse. In a way, .Astronomy was partly sparked by Galaxy Zoo too.

The folks at .Astronomy have built many projects that embrace the web fully, with an ethos of sharing and participation. These projects are changing the way astronomy and outreach are done: Chromoscope, 365 Days of Astronomy, AstroBetter, Astropy, astrojs, and the Seamless Astronomy group’s ‘Bones of the Milky Way‘ paper; there are more, but these are excellent examples.

So after .Astronomy 5 I’m left wondering where to take it next in order to facilitate more of these projects. There were 40 hack day pitches at this year’s event. There were so many hack day reports the following day (the 2–3 minute slots where people show off their results) that we had to overrun into coffee and use up most of lunchtime too. Many of those hacks will, I hope, soon be appearing on the .Astronomy blog when people have time to write them up. Some of them are already popping up on GitHub (e.g. d3po).

The other wonderful thing about the meeting was how it once again encouraged genuine debates and discussions that sound like they might actually lead to change. The unconference sessions on diversity in astronomy went beyond the usual format and did not fall into the trap of collectively preaching to the choir. A document has been drafted with actionable ideas; I hope it is revisited soon. Similarly, sessions on the future of academic publishing were not bogged down in the usual complaints but actually became a real debate about practical things we could do differently.

There were also highly informative unconference sessions that would not have happened elsewhere; enthusiastic tutorials on Astropy, Authorea and the merits of combining noisy classifiers all jump to mind. These meetings organically emerge from the crowd at .Astronomy and they’re interactive, productive, and brilliant.

So as I ponder on the future of .Astronomy (I’d love your thoughts) I’ll leave you with some of the wonderful video hacks that were produced at this year’s event. Don’t Call Me Colin is a song about a sad exoplanet from Niall Deacon, Emily Rice, Ruth Angus and others. There is also a timelapse of .Astronomy itself in action from Amanda Bauer.

Thank you to everybody who took part, gave their time to talk, helped organise the event, and followed along online. It was a great meeting and I’m already looking forward to the next one. Long live #dotastro!

During the Perseid meteor shower, I blogged a video of a bright meteor taken by astrophotographer Mel Gigg. He had shared the image fairly widely and soon others noticed that they had caught the exact same shooting star themselves. In fact, four observers had caught the same object as it flew into the atmosphere above Southern England, and three of them have shared their images online (Wayne Young, Mel Gigg and Steve Knight).

Credit: Wayne Young


Credit: Mel Gigg


Credit: Steve Knight


Look carefully and you’ll see that these images show the same streak of light but against drastically different star fields. That’s because meteors are high above the ground and visible across a large area. Due to the effect of parallax, they appear to shift relative to the night sky for different observers. In the extreme, an observer directly underneath the meteor would see it go overhead, whereas others might see it from the side, where it would appear to fly nearer to the horizon. In this case it was seen by four people from different positions, so they each had a different angle on the meteor and a different backdrop of stars.
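A rough back-of-the-envelope sketch shows why the star backdrops differ so much. The ~90 km ablation height is typical for Perseids; the 50 km baseline between observers is my assumption:

```python
# Parallax of a meteor seen by two observers some distance apart.
# A meteor burning up ~90 km overhead shifts by tens of degrees
# against the stars, which are effectively at infinity.
import math

altitude_km = 90   # typical meteor ablation height
baseline_km = 50   # assumed distance between two observers

shift_deg = math.degrees(math.atan2(baseline_km, altitude_km))
print(f"Apparent shift between observers: ~{shift_deg:.0f} degrees")  # ~29
```

It's exactly this large, measurable shift that makes triangulating the true path possible from a handful of photographs.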

In a wonderful example of citizen science, Wayne Young (one of the four photographers) took the four images and the lat/long data of each observer’s location, and created a 3D model of this particular Perseid’s path. You can see it below modelled in Google Earth (KML file here).

To create this he triangulated the path of the meteor by comparing each of the four images to one another. Given the capabilities of computer vision tools and astrometry.net, I wonder how much of this could be automated. It wouldn’t be hard to search Flickr for shooting stars seen at similar times and locations – maybe we could scrape more trajectories automagically? This might be an ideal hack day project for .Astronomy. It would be interesting to plot many of these paths on top of each other.

It’s fun to surmise that, given this Perseid’s path, it would have touched down in a field in North Devon. Good job it most likely disintegrated long before then.