Archives For astronomy

I got into a conversation recently about how some astronomical photos can totally change your perspective of yourself and your place in the Universe. There are several images that come to mind right away – here are my own favourites:

1. The Milky Way (from a very dark location)

Milky Way

Seeing the night sky from a dark site is something most people don’t do very often, now that most of us live in cities. The vision of the Milky Way overhead can be startling, and a pair of binoculars makes it more so, revealing that its delicate structure is made of millions of stars. This long-exposure photo of the dust lanes in our galaxy [1] is the first image that can really change your perspective on yourself and your place in the cosmos.

2. Earthshine on a crescent moon

Young Crescent Moon with Earthshine

When the Moon is just a thin crescent in the evening sky you can often see the rest of its face, dimly lit and slightly reddened. This part of the Moon is not lit directly by the Sun, as the crescent is, but by sunlight reflected off the Earth from places where the Sun has not yet gone down. You’re seeing other people’s daylight, bounced back at you from around the world [2][3].

3. Aurora and lightning from the ISS

Sometimes a change in perspective can be quite literal – as with this video of the Earth seen from the International Space Station. The green structures are aurorae – the Northern Lights, over Canada in this case. You can also catch the occasional flash of lightning. This time-lapse is haunting and shows you a view you could probably never otherwise see.

4. M31 compared to a full moon

M31 compared to the full Moon

The Andromeda Galaxy is our nearest neighbouring galaxy and can be seen as a faint fuzzy patch in the northern sky. What is amazing, though, is to realise that it is in fact quite a large object – bigger than our own Moon in our sky. Our eyes just don’t see it very well! Long-exposure images show just how big it really is. Combine this with the fact that it is 2.5 million light years away [4] and you begin to realise that the galaxy next door is truly enormous. It’s about the same shape, size, and type as our own Milky Way too, so we would look pretty similar to anyone looking up at the sky from a planet in the Andromeda Galaxy.

5. Earth from Saturn (and other places)

The Earth seen from Saturn by Cassini (NASA image PIA17172)

There are perhaps no images quite as humbling and perspective-shifting as the set we might call the ‘pale blue dots’. These are the small set of images of the Earth from far, far away, taken by the robots we have sent out into the Solar System. Voyager 1 took one in 1990 from about 6 billion kilometres away; Cassini has taken more than one from the Saturnian system (like the one above); a few have been taken from Mars too. All of them show the Earth as just a pixel or so across, compressing all of humanity, the world, and all life as we know it into a teeny tiny speck against the cosmos.

6. Orion’s Proplyds

Proplyds in the Orion Nebula

These dark blobs hidden within the star-forming complex of the Orion Nebula are known as proplyds – protoplanetary disks. They are embryonic solar systems in the making. Each of these blobs is far larger than our own Solar System (they shrink as they evolve into mature planetary systems), which gives you some idea of how large the Orion Nebula is in total. We were once shrouded in such a dusty blob ourselves – though long before the Earth formed.

7. The Sloan Great Wall

The largest surveys of galaxies reveal structures in the Universe so vast that they are practically beyond comprehension – but let’s try anyway, shall we? The Sloan Great Wall is a filament of galaxies snaking through the Universe, apparently physically connected and bound together by gravity. The ‘wall’ is 1.38 billion light years across. That’s about 1/67th of the width of the observable Universe! When light is emitted at one end it doesn’t reach the other for 1.38 billion years. It is more than 500 times as long as the distance between the Milky Way and Andromeda. I told you it was hard to imagine.

8. Apollo 8 on Christmas Eve 1968

Earthrise, taken by the crew of Apollo 8 (NASA image AS8-14-2383)

I thought it would be good to end on something a little closer to home. On December 24th 1968, astronauts Bill Anders, Jim Lovell, and Frank Borman were the voices heard on one of the most-watched television broadcasts of all time. As they read passages from the Bible’s Book of Genesis, they broadcast a grainy image of the Earth as seen from the orbit of the Moon. The world watched itself from space for the first time, and saw the Earth as a singular marble set against the deep black of space. The image has since been remastered and still represents an era, and a moment in human history, that many find totally perspective-changing. A symbol of a race of beings from a tiny planet, venturing outward to explore space and the worlds beyond their own. Remarkable.


[1] I recently had my first go at some proper astrophotography from a dark site. My target was the Milky Way and the result was this image of the dust lanes of our galaxy, looking toward the galactic centre. I’m pretty happy with it for a first go.

[2] This effect can also be seen on other moons around other planets and is generically called ‘planetshine’.

[3] This also serves as a good reminder that there is a part of the Moon we never see – the far side – which is lit by the Sun, but just never seen from Earth.

[4] That distance gets smaller all the time, and Andromeda will actually collide with us in about 4 billion years.

Executable papers are a cool idea in research [1]. You take a study, write it up as a paper and bundle together all your code, scripts and analysis in such a way that other people can take the ‘paper’ and run it themselves. This has three main attractive features, as I see it:

  1. It provides transparency for other researchers and allows everyone to run through your working to follow along step-by-step.
  2. It allows your peers to give you detailed feedback and ideas for improvements – or to make the improvements themselves.
  3. It allows others to take your work and try it out on their own data.

The main problem is that executable papers don’t really exist ‘in the wild’, and where they do they’re in bespoke formats, even if open source. The IPython Notebook is a great way of doing something very much like an executable paper, for example. Another way would be to bundle up a virtual machine and share a disk image. Executable papers would allow rapid-turnaround science to happen. For example, let’s imagine that you create a study and use some current data to form a theory or model. You do an analysis and create an executable paper. You store that paper in a library and the library periodically reruns the study when new data become available [2]. The library might be a university library server, or maybe it’s something like the arXiv, ePrints, or GitHub.
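
To make that concrete, here is a minimal sketch of what a library-side re-run loop might look like. Everything in it is hypothetical: the data URL, the assumption that the dataset exposes an ETag version header, and the run_paper() stand-in for the paper’s bundled analysis.

```python
# A minimal sketch of the 'library re-runs the paper' idea. The data URL,
# the ETag-based versioning, and run_paper() are all hypothetical.
import time
import urllib.error
import urllib.request

DATA_URL = "https://example.org/survey/latest.csv"  # hypothetical dataset

def fetch_if_changed(url, last_etag):
    """Download the dataset only if the server reports a new version."""
    request = urllib.request.Request(url)
    if last_etag:
        request.add_header("If-None-Match", last_etag)
    try:
        with urllib.request.urlopen(request) as response:
            return response.read(), response.headers.get("ETag")
    except urllib.error.HTTPError as err:
        if err.code == 304:  # 304 Not Modified: nothing new to do
            return None, last_etag
        raise

def run_paper(data):
    """Stand-in for re-executing the paper's bundled analysis code."""
    print(f"Re-running the analysis on {len(data)} bytes of new data...")

etag = None
while True:
    data, etag = fetch_if_changed(DATA_URL, etag)
    if data is not None:
        run_paper(data)
        # ...then republish, e.g. push a new version to the library's archive
    time.sleep(24 * 60 * 60)  # poll once a day
```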

This is roughly what happens in some very competitive fields of science already – only with humans. Researchers write papers using simulated data and, the instant they can access the anticipated real data, they import, run and publish. With observations of the Cosmic Microwave Background (CMB), several competing groups are waiting to work on the data – and new data come out very rarely. In fact, the day after the Planck CMB data were released last year, there was a flurry of papers submitted to the arXiv. Those who got in early had likely pre-written much of the work and simply ran their code as soon as they had downloaded and parsed the newly published data.

If executable papers could be left alone to scan the literature for new, useful data then they could also look for new results from each other. A set of executable papers could work together, without planning, to create new hypotheses and new understanding of the world. Whilst one paper crunches new environmental data, processing it into a catalogue, another could use the new catalogue to update climate change models and even automatically publish significant changes or new potential impacts for the economy.

It should be possible to make predictions in executable papers and have them automatically check for certain observational data and republish updated results. So one can imagine a topical astronomy example where the BICEP2 results would be automatically checked against any released Planck data, with new publications created when statistical tests are met. Someone should do this if they haven’t already. In this way, papers could continue to further, or verify, our understanding long after publication.
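
Here is a hedged sketch of that trigger logic, loosely inspired by checking a BICEP2-like claim against a later measurement. The fetch_latest_measurement() helper and all the numbers are placeholders; the point is just the shape of the test-then-republish step.

```python
# A sketch of the 'republish when a statistical test is met' step. The
# numbers and fetch_latest_measurement() are hypothetical placeholders.
from math import erf, sqrt

PREDICTED_R = 0.20     # the tensor-to-scalar ratio the 'paper' predicts
SIGMA_THRESHOLD = 3.0  # republish once the data settle things at 3 sigma

def fetch_latest_measurement():
    """Placeholder: would really parse a newly released dataset or table."""
    return 0.05, 0.04  # (value, 1-sigma uncertainty)

value, sigma = fetch_latest_measurement()
tension = abs(PREDICTED_R - value) / sigma
p_value = 1.0 - erf(tension / sqrt(2.0))  # two-sided Gaussian p-value

if tension > SIGMA_THRESHOLD:
    print(f"Prediction excluded at {tension:.1f} sigma (p = {p_value:.2g}): "
          "regenerate and republish the paper.")
else:
    print(f"No significant tension ({tension:.1f} sigma): nothing to do.")
```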

SKA Rendering (Wikimedia Commons)

This is high-frequency science [3], akin to high-frequency trading, and it seems like an interesting approach to some upcoming data-flow issues in science. The Large Hadron Collider (LHC), the Large Synoptic Survey Telescope (LSST), and the Square Kilometre Array (SKA) are all huge scientific instruments set to explore new parts of the universe, gathering huge volumes of data to be analysed.

Even the deployment of Zooniverse-scale citizen science cannot get around the fact that instruments like the SKA will create volumes of data that we don’t know what to do with, at a pace we’ve never seen before. I wonder if executable papers, set to scour the SKA servers for new data, could alleviate part of the issue by automatically searching for theorised trends. The papers would be sourced by the whole community and peer-reviewed as is done today, effectively crowdsourcing the hypotheses through publications. This cloud of interconnected, virtual researchers would continuously generate analyses that could be verified by some second peer-review process, since one would expect a great deal of nonsense in such a setup.

When this came up at a meeting the other day, Kevin Page (OeRC) remarked that we might just be describing sensors. In a way he’s right – but these are software sensors, built on the platform and infrastructure of the scientific community. They’re more like advanced tools; a set of ghost researchers, left to think about an idea in perpetuity, in service of the community that created them.

I’ve no idea if I’m describing anything real here – or if it’s just a way of partially automating the process of science. The idea stuck with me and I found myself writing about it to flesh it out – thus this blog post – and wondering how to code something like it. Maybe you have a notion too. If so, get in touch!

———-

[1] But not a new one really. It did come up again at a recent Social Machines meeting though, hence this post.
[2] David De Roure outlined this idea quite casually in a meeting the other day. I’ve no idea if it’s his or just something he’s heard a lot and thought was quite cool.
[3] This phrasing isn’t mine, but as soon as I heard it, I loved it. The whole room got chatting about this very quickly so provenance was lost I’m afraid.

I’ve started a page with some links, facts and ideas for teachers, educators and anyone else that wants them. Quite often when I’m visiting schools, I throw lots of URLs around and talk about websites, books, etc that kids and teachers might like. Then I often forget to give them these URLs and tips. So now I’m putting them on a single page instead. Check it out here: https://orbitingfrog.com/astronomy-links-for-teachers/

Feel free to suggest additions to this page by contacting me online or via Twitter @orbitingfrog.

Just over three years ago the Zooniverse launched the Milky Way Project (MWP), my first citizen science project. I have been leading the development and science of the MWP ever since. 50,000 volunteers have taken part from all over the world, and they’ve helped us do real science, including creating astronomy’s largest catalogue of infrared bubbles – which is pretty cool.

Today the original Milky Way Project is complete. It took about three years, and users have drawn more than 1,000,000 bubbles and several million other objects, including star clusters, green knots, and galaxies. It’s been a huge success, but there’s even more data! So it is with glee that we have announced the brand new Milky Way Project! It’s got more data, more objects to find, and it’s even more gorgeous.

This second incarnation of my favourite Zooniverse project[1] has been an utterly different experience for me. Three years ago I had only recently learned how to build Ruby on Rails apps and had squirrelled myself away for hours carefully crafting the look and feel for my as-yet-unnamed citizen science project. I knew that it had to live up to the standards of Galaxy Zoo in both form and function – and that it had to produce science eventually.

Building and launching at that time was simpler in one sense (it was mostly just me and Arfon doing the coding[2]) but much harder in another, as I was referring to the Rails manual constantly and learning Amazon Web Services on the fly. This week I have had the help of a team of experts at Zooniverse Chicago, whom I normally refer to collectively as the development team. They have helped me by designing and building the website, and by integrating it seamlessly into the now-buzzing Zooniverse infrastructure. The result has been an easier, smoother process with a far superior end result. I’ve essentially acted more like a consultant scientist, with a specification and requirements. I’ve still gotten my hands dirty (as you can see in the open source Milky Way Project GitHub repo) but I’ve managed to keep doing everything else I now do day-to-day at the Zooniverse. It’s been a fantastic experience to see personally how far we’ve come as an organisation.

The new MWP is being launched to include data from different regions of the galaxy in a new infrared wavelength combination. The new data consist of Spitzer/IRAC images from two surveys: Vela-Carina, which is essentially an extension of GLIMPSE covering Galactic longitudes 255°–295°, and GLIMPSE 3D, which extends GLIMPSE 1+2 to higher Galactic latitudes (at selected longitudes only). The images combine 3.6, 4.5, and 8.0 µm in the ‘classic’ Spitzer/IRAC colour scheme[3]. There are roughly 40,000 images to go through.

An EGO (or two) sitting in the dust near a young star cluster

The latest Zooniverse tech and design is being brought to bear on this big-data problem. We are using our newest features to retire images with nothing in them (as determined by the volunteers, of course) and to give more screen time to those parts of the galaxy where there are lots of pillars, bubbles and clusters – as well as other things. We’re marking more objects – bow shocks, pillars, EGOs – and getting rid of some older ones that either aren’t visible in the new data or weren’t as scientifically useful as we’d hoped (specifically: red fuzzies and green knots).

It’s very exciting! I’d highly recommend that you go now(!) and start classifying at www.milkywayproject.org – we need your help to map and measure our galaxy.

—–

[1] It’s like choosing between your children

[2] Arfon may recall my resistance to unit tests

[3] Classic to very geeky infrared astronomers

Astronomy in Everyday Life

November 6, 2013 — 3 Comments

Astronomers are sometimes asked to defend public funding of their work. It’s difficult to answer because I really do think there are lots of things we should do just because they’re interesting and enriching, and that science shouldn’t be limited to what is economically beneficial. That said, astronomy is often given an easy ride because it is pretty and we have people like Neil deGrasse Tyson, Brian Cox and Dara O’Briain on our side. One approach is to talk about how much useful stuff astronomy has produced.

When you look around your life – and your house – you’d be surprised at how much is connected to astronomy and space exploration. Assuming you’re like me (i.e. living in the UK in 2013) you probably own several pieces of space-based technology. For a start you most likely use WiFi – in fact you might be reading this via WiFi right now! WiFi is based on work by John O’Sullivan at CSIRO in Australia: the WLAN (Wireless Local Area Network) provided by your router results from technology developed by radio astronomers. More than a billion people are using it in 2013!

There’s also your GPS device. GPS determines your position by receiving the signals given off by a network of satellites orbiting the Earth. By comparing the time delays in the arrival of the different signals, the GPS chip can figure out its latitude and longitude to within about 10m. The GPS system not only involves satellites; each of those satellites houses an atomic clock, and the system must incorporate Einstein’s equations of general relativity in order to know its position precisely [1]. It might be the most space-aged thing you own!
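
As an aside, the core position calculation is a nice little least-squares problem. The sketch below uses invented satellite coordinates and ignores the relativistic corrections just mentioned, but it shows how a receiver can recover both its position and its clock error from four pseudoranges.

```python
# A toy version of the GPS position solution: solve for the receiver's
# position and clock error from four pseudoranges. Satellite coordinates
# are invented; relativistic corrections are ignored.
import numpy as np
from scipy.optimize import least_squares

# A receiver on the Earth's surface and four GPS-like satellites (metres).
true_receiver = np.array([6_371_000.0, 0.0, 0.0])
sats = np.array([
    [26_571_000.0,           0.0,           0.0],
    [15_000_000.0, 21_000_000.0,   5_000_000.0],
    [15_000_000.0, -8_000_000.0,  20_000_000.0],
    [18_000_000.0, 10_000_000.0, -15_000_000.0],
])
true_bias = 3_000.0  # receiver clock error, expressed in metres (c * dt)

# 'Measured' pseudoranges: true distance plus the unknown clock bias.
pseudoranges = np.linalg.norm(sats - true_receiver, axis=1) + true_bias

def residuals(state):
    position, bias = state[:3], state[3]
    return np.linalg.norm(sats - position, axis=1) + bias - pseudoranges

fit = least_squares(residuals, x0=np.array([6e6, 0.0, 0.0, 0.0]))
print("Recovered position (m):", fit.x[:3])   # ~ true_receiver
print("Recovered clock bias (m):", fit.x[3])  # ~ 3000
```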

There’s a small chance that you sleep on a memory foam mattress or pillow. Memory foam was created in 1966 by contractors working for NASA, as a way to better cushion and secure people going into space [3]. Similarly, iodine water filters derive from NASA work in the 1970s to create safe drinking water on long missions, and scratch-resistant glass coatings were created to make better visors for astronauts.

Memory Foam

Contrary to popular belief, Teflon (the non-stick coating on saucepans) was not invented by NASA for the Apollo programme. In fact, it already existed and was simply used by NASA, who may have helped popularise it in industry at the time. I’ll also not mention CCDs here, since I’m no longer sure that astronomy had much to do with their success! [2]

Outside of your home, there are many other places where the technology results from space research. A great deal of medical tech comes from space exploration, which shouldn’t be surprising given that both fields are often trying to see or detect things in tricky or unusual environments. Software for detecting features in satellite imagery is being applied in medicine, including to detect the signs of Alzheimer’s disease in brain scan data. The detection of breast cancer tumours was vastly improved by techniques from radio astronomy, and instruments that began as ways to delicately monitor the temperature of fragile telescope components are used in neonatal care today. At the airport, the X-ray scanner uses tech derived from X-ray telescopes [4], and security staff may sometimes check your bag or coat for traces of certain chemicals by placing it in a gas chromatograph originally designed for a Mars mission [4].

Astronomers are often also coders and software developers. As well as being responsible for the 2008 banking fiasco (I’m joking, maybe) they are good at creating software that others find very handy. The visualisation software IDL is many astronomers’ language of choice and was developed in the 1970s at the Laboratory for Atmospheric and Space Physics (LASP) at the University of Colorado at Boulder [5]. IDL is used in lots of research today in areas including defence and climate monitoring, and by companies like Texaco, BP, and General Motors [6].

All of this is just the practical, modern stuff. Let’s not forget another thing you hold very dear: time itself. The calendar, especially its handy monthly segments, is astronomical in origin. The second, which seems so commonplace (i.e. it happens all the time), was defined in terms of the Earth’s rotation until astronomers realised that the length of a day was changing, and so suggested a switch to defining it in terms of the Earth’s orbit around the Sun. Then we realised using an atomic clock would make more sense and handed our time-defining powers over to the particle physicists [7].

Finally, I just want to say that yesterday a paper appeared on the arXiv titled ‘Why is Astronomy Important?’ and it prompted me to finish this blog post about astronomy in everyday life, which I’ve had kicking around for ages. A big thanks to Marissa Rosenberg, Pedro Russo, Georgia Bladon, and Lars Lindberg Christensen for their timely paper – and their handy references!

UPDATE: There are also two handy booklets on this topic from the Royal Astronomical Society, you can find them here and here.

———-

  1. http://www.physics.org/article-questions.asp?id=55
  2. If you’re in to digital photography then you may have debated the benefits of the CMOS and CCD imaging technologies. All digital cameras, camera phones and webcams use one of these two types of tech. CCDs were developed in 1969 at Bell Labs (the 2009 Nobel Prize was awarded to its inventors Smith and Boyle) and they became very popular in astronomy. CCDs are said to have popularised by their use in the Hubble Space Telescope but I’m not sure I buy it and can’t find evidence for it.
  3. http://en.wikipedia.org/wiki/Memory_foam
  4. http://www.nap.edu/catalog.php?record_id=12951
  5. http://en.wikipedia.org/wiki/IDL_(programming_language)
  6. http://arxiv.org/abs/1311.0508
  7. http://tycho.usno.navy.mil/leapsec.html

A new journal begins today, Astronomy and Computing, covering the intersection of astronomy, computer science and information technology.

This journal is desperately needed in my view and I wish it every success. The timing is interesting as many people at the intersection of these research areas are skeptical of old-style journals and the current state of publishing in general. However, I look forward to reading it and maybe even submitting articles.

You’ll find it at http://www.sciencedirect.com/science/article/pii/S2213133712000029

That’s No Supermoon

June 24, 2013 — 4 Comments

The periodic mention of a ‘supermoon’ in the news cycle is starting to annoy me. A supermoon is simply not that much bigger than any other full Moon! The difference is apparently just perceptible, but by no means would you call it ‘super’. Annoyingly though, observation of the so-called supermoon is wrapped up in another effect: the Moon Illusion. This means that people enthusiastically report seeing a really big Moon, but don’t realise that they would likely have thought it big on any other full Moon night too.

So let me put my rant in some context. The term supermoon was coined by astrologer Richard Nolle about 30 years ago. It refers to a Full Moon or New Moon that occurs when the Moon is in the closest part of its orbit around the Earth. The Moon’s orbit is not perfectly circular: there is a closest point in every cycle (perigee) and a most-distant point too (apogee). At perigee the Moon is closer to the Earth by about 50,000 km (30,000 miles), which is enough to make it appear slightly larger in the night sky – about 1.1 times larger in its angular diameter. Expert Moon watchers can see a subtle difference, but it’s pretty slight and hardly warrants the title of a ‘super’ moon.
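
If you want to check that 1.1 figure yourself, the arithmetic fits in a few lines (using typical, approximate apogee and perigee distances):

```python
# How much bigger is a perigee Full Moon? Approximate textbook distances.
import math

MOON_DIAMETER_KM = 3474.8
PERIGEE_KM = 363_300.0  # typical closest distance
APOGEE_KM = 405_500.0   # typical farthest distance

ang_perigee = math.degrees(2 * math.atan(MOON_DIAMETER_KM / (2 * PERIGEE_KM)))
ang_apogee = math.degrees(2 * math.atan(MOON_DIAMETER_KM / (2 * APOGEE_KM)))

print(f"Perigee: {ang_perigee:.3f} deg; apogee: {ang_apogee:.3f} deg")
print(f"Ratio: {ang_perigee / ang_apogee:.2f}")  # ~1.12: 'super', apparently
```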

The Moon going through one complete orbit as seen from the Earth.

This animated GIF shows the Moon going through one entire orbit (apogee-perigee-apogee) and you can see the changing size (you can also see it undergoing libration, the wobbling motion). You can see a direct size comparison below. In both cases you’re seeing it close up – imagine these things hanging in the sky at a distance. The size change happens in every cycle, but is most prominent when the Full Moon coincides with perigee, as was the case this week.

Size comparison for the Moon at apogee and perigee [Source: http://www.fourmilab.ch/earthview/moon_ap_per.html]

So there is a difference in the appearance of the Moon, but it is very small and you’re unlikely to be seeing it when you go outside to look at a supermoon. What you’re actually experiencing is most likely the Moon Illusion: the optical illusion that the Moon looks larger when it is near the horizon than when it is high in the sky. The Moon Illusion is not well understood, but most astronomers are very familiar with it. It may be partially caused by the Ebbinghaus illusion, which is the one that makes the two central circles in the following image appear to be different sizes when they are, of course, the same. When close to the horizon the Moon is compared to objects like rooftops, hills and clouds; when high in the sky it is mostly seen in wide-open space. Another explanation may lie in the processes that govern our binocular vision; it might be that the Moon Illusion does not occur if you stand on your head, for example. This has not (yet) been widely tested.

Ebbinghaus Illusion

So what happened over the weekend was that people heard about a supermoon and went outside to see it. Given that any observable supermoon is a Full Moon, this means people went out when the Moon was low down in the sky, because Full Moons rise in the evening. Thus they probably experienced the Moon Illusion and reported that indeed the Moon looked very large.

On a final point: the supermoon is also given silly superpowers by some news outlets. The natural oscillation of the Moon’s distance does indeed affect tides a little, but it does not cause earthquakes, madness or werewolves.

Image Credit: Jack Newton

There’s a cool paper on arXiv today in which an intrepid band of astronomers (I assume they were/are intrepid) search for exoplanets around the stars in the Pleiades using Subaru. Spoiler alert: they don’t find any! However, it’s an interesting look at how to hunt for planets and small/faint objects in general.

They find 13 potential planet candidates around 9 stars. Five of these were confirmed as background stars, and two more were dismissed because they either didn’t appear in all the data or the data they did appear in weren’t good enough. Two more were found to be known brown dwarfs, with masses about 60 times that of Jupiter. The remaining four candidates still await further data to confirm their motion across the sky – but aren’t thought to be planets either.

By not detecting any planets with a very sensitive instrument, they are able to estimate an upper limit for the frequency of such planets around stars in the Pleiades. So by not finding planets, they learn something really interesting. Well done, science.
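
The core of a non-detection upper limit is surprisingly simple. This toy calculation uses a made-up number of surveyed stars and ignores detection efficiency (which the real paper models carefully), but it shows the logic:

```python
# The bare-bones logic of a non-detection upper limit: if the true occurrence
# rate of such planets were f, the chance of finding zero around N stars is
# (1 - f)^N. Setting that to 5% gives a 95% confidence upper limit on f.
# N is made up here, and real analyses also fold in detection efficiency.
N = 20                      # hypothetical number of stars surveyed
f_95 = 1 - 0.05 ** (1 / N)  # solve (1 - f)^N = 0.05 for f
print(f"95% upper limit on the occurrence rate: {f_95:.1%}")  # ~13.9%
```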

ESA’s Planck mission reported results today showing the Cosmic Microwave Background (CMB, see below) in greater detail than ever before.

Planck achieves this amazing view of the earliest light in the Universe by combining and cleverly cross-matching data across 9 different frequencies, ranging from 30 to 857 GHz. In this way the team can remove foreground emission and effectively strip away the content of the whole Universe, to reveal the faint CMB that lies behind it. It’s amazing work.
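
Planck’s actual component-separation pipeline is far more sophisticated, but the textbook version of the idea is the ‘internal linear combination’ (ILC): weight the frequency maps so the weights sum to one (preserving the CMB, which looks the same in every band in thermodynamic units) while minimising the variance contributed by everything else. A toy demonstration with fake maps:

```python
# A toy 'internal linear combination' (ILC) on fake maps: nine frequency
# bands, each seeing the same CMB plus a frequency-dependent foreground and
# noise. Not Planck's real pipeline, just the textbook idea behind it.
import numpy as np

rng = np.random.default_rng(42)
npix, nbands = 10_000, 9

cmb = rng.normal(0.0, 1.0, npix)               # identical in every band
foreground = rng.normal(0.0, 1.0, npix) ** 2   # a crude common foreground
scaling = np.linspace(0.2, 3.0, nbands)        # its strength varies by band

maps = (cmb[None, :]
        + scaling[:, None] * foreground[None, :]
        + rng.normal(0.0, 0.1, (nbands, npix)))  # plus instrument noise

# Minimum-variance weights that sum to one, so the CMB passes through
# unchanged while the foreground and noise largely cancel.
C = np.cov(maps)                        # band-by-band covariance
cinv_one = np.linalg.solve(C, np.ones(nbands))
weights = cinv_one / cinv_one.sum()

cleaned = weights @ maps
print("RMS error of one raw band:", np.std(maps[0] - cmb))
print("RMS error after ILC:      ", np.std(cleaned - cmb))
```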

To accompany the announcement, Planck have released a Chromoscope-based version of their full data set here. This site shows all 9 bands (plus a composite image and the visible sky for reference) and lets you slide between them, exploring the different structures found at different wavelengths.

You can rearrange the different bands and turn on useful markers like constellations and known microwave sky features. It’s just great!

There is also an option to view the data in terms of the content – or components – of the Universe. You can see that version here. You can switch between these views using the options box on the left-hand side.

In this version of the site you’re able to see the different structures that contribute to the overall Planck sky image. This is how you can really start to understand what Planck is seeing and how we need to ‘extract’ the foreground emission from the data. In this view you can look at the dust, the emission purely from Carbon Monoxide (a common molecule at these wavelengths), the CMB itself and the low-frequency emission from elsewhere (such as astronomical radio sources).

Cardiff’s Chris North put this site together (you can find him on Twitter @chrisenorth), and it was Chris, along with Stuart Lowe and me, who first put Chromoscope together many moons ago. I can’t take much credit for Chromoscope really, but it’s fantastic to see it put to use here.

This is the wonderful blend of open science and public engagement that I love, and that astronomy is getting better at in general. What Planck are doing here is making the data freely available in a form that is digestible to the enthusiastic non-specialist.

This sort of ‘outreach’ is enabled by the modern web’s ability to make beautiful websites relatively painless to build and cheap to host. It’s also possible because we have people, like Chris North, who know about both the science and the web. Being comfortable on the Internet and ‘getting’ the web are so important today for anyone that wants to engage people with data and science.

So, go explore! You can zoom right in on the data and even do so in 9 frequencies. There is a lot to come from Planck – as scientists get to work pumping out papers using these data – so this site will be a handy reference in the future. It’s also awesome: did I mention that?

[URL: http://astrog80.astro.cf.ac.uk/Planck/Chromoscope/]

I was at RAL today, as part of a teacher training event run by the National Space Academy, to talk about the Zooniverse and how our projects can be used to teach astronomy, science and maths.

I gave an overview of the Zooniverse, and then highlighted ZooTeach, the dedicated website where teachers and educators can create and share lesson plans, centred around the Zooniverse citizen science websites.

ZooTeach is great and will only get better as more teachers know about it and use it. Check it out at http://www.zooteach.org and follow on Twitter at @ZooTeach.