SciBlogs

Archive August 2009

Personal space issues: now we know the ‘where’ aimee whitcroft Aug 31

Caltech continues to live up to its reputation for producing some fine work.

In a recent coup (haha), they appear to have figured out the whole ‘personal bubble’ issue. Well, to be more precise, they’ve figured out which brain structure is responsible.

Fascinatingly, it’s the amygdala. This is the region in our brain responsible for feelings of fear, anger, and other strong negative emotions (I’m sure it was mentioned in the last episode of Fringe, actually).


It seems that lesions or other ‘damage’ or dysfunctions in the amygdala mean that the person in question is comfortable at closer proximities to others than is usually the case. In fact, it appears that, for some people at least, they have no sense of personal space at all – they can feel completely comfortable standing nose to nose! Which makes me wonder: do all the people in those chewing gum ads have dysfunctional amygdalas?

Happily, the researchers also considered the role that culture may play in perceptions of personal space: I know from personal experience that these can vary widely, and consequently cause a fair amount of unease in the person used to a larger bubble. Apparently, they think that culture and experience may, over time, affect the brain and how it responds to situations (yes, the brain is plastic and learns…). Makes sense.

xkcd: Newton and Leibniz aimee whitcroft Aug 31

I couldn’t resist. For the uninitiated, xkcd is a marvellous webcomic revolving around maths, physics, and, well, the human experience. In fact, its author, Randall Munroe, is even publishing a (real) book!
(subtitle: A webcomic of romance, sarcasm, math and language)

So yes: the strip itself:

Gadgets, Games and Geeks 09: The Future of Innovation, Shatter, Weta and pizza aimee whitcroft Aug 26

A couple of days ago (Monday evening, to be exact), I attended GGG09 – Gadgets, Games and Geeks 09 (note: a logo would be a good thing, guys).

And yes, it was pretty interesting.

The highlight for me was Bill Reichert’s talk, ‘The Future of Innovation: Entrepreneurship, Venture Capital and Emerging Technologies’ (if you’re interested, you can find the talk and accompanying slides on the SMC’s website, here). He’s a very engaging speaker, and had some great pieces of knowledge to impart.

Certainly it got me all inspired again about entrepreneurship – a subject close to my heart, as it’s the focus of one of my qualifications. And, while much of it absolutely seemed like common sense, there were a couple of surprises, particularly the ‘change takes time’ point he makes (point 9 in his talk). Essentially, he says that we’ve talked ourselves into believing that the pace of change is accelerating, but that this simply is not the case. We need only look, he says, at how long it has actually taken us to get, for example, to high bandwidths and oodles of storage or, for that matter, electric cars which aren’t completely useless (or incredibly expensive). Other examples abound (really, have a look at his slides).

I also found his point about Twitter very interesting (towards the end of the session, in response to a question from the audience). He said that his issue with Twitter was simply that it gave entrepreneurs the wrong idea: that they could come up with a clever idea, get a few million ‘eyeballs’, and as a result make (lots of) money off it. After all, the jury is still out as to whether Twitter itself can make money, and how.

Having said that, it was definitely encouraging to hear that it’s not all doom and gloom – actually, a personal belief I’ve heard mirrored many times is that tough times actually enhance creativity by shocking everyone out of their bubbles. So we should have lots to look forward to.

(note: as usual, clicking on the logos will take you to the appropriate sites)

I also found Sidhe’s talk very interesting (I have recorded it, and can put it up if requested – the slides can be found here). James Everett did a great job of explaining who Sidhe are, why they want more game developers in Wellington (amusingly, ‘because it’s difficult to poach from yourself’), and where they’re hoping to go in the future.

And I am definitely intrigued by their idea: shorten development times, shorten game lengths and bring down prices. Sounds like just my type of gaming. And Shatter really is very, very cool. Yes, it’s Pong, but it’s new Pong, and gosh is it pretty. If only it were available for PC…

Sadly, I found the talk by Tim Lauder of Weta Cave a little less thrilling than the previous two, although, as a lifelong fan of steampunk, I did enjoy the whole Dr Grordbort’s thang (I almost bought a lapel pin!)

I think my only real complaint was that there could have been more exhibitors. I have some theories on why there weren’t (nothing I’ll air, of course), but it really would have been a wonderful way to showcase some more work. For example, I know a guy up in Palmerston North whose company, Unlimited Realities, has been developing Dell’s new touchscreen software.

On the other hand, the enormous slices of pizza which rounded (haha) the evening off were brilliant.

So yes, here’s to GGG09, and hoping that GGG10 is even better!

Addendum to previous post aimee whitcroft Aug 25

I found this delightful website (click on logo below)


JOVE, or the Journal of Visualised Experiments, is a fantastic idea. Anyone working in the biological sciences will, I’m sure, appreciate how tricky it can be to learn or duplicate a new methodology, and it’s this problem which JOVE aims to solve.

In essence, it’s a peer-reviewed journal (although no longer OA, sadly) where the content is all video, rather than words.

The (threat) challenge to science publishing aimee whitcroft Aug 24


An article I recently wrote for the Science Media Centre (note, there’s a really cool sound recording from the WCSJ on the SMC website, here). The debate’s a complicated one – this article just manages to lightly touch upon some of the issues…

The Open Access (OA) movement has been around since the 1990s — not surprising, as one of its principal tenets is that information should be freely available online. More specifically, it generally refers to scientific information, and in particular the information found in scientific journals. As we all know, this information is usually not freely accessible: rather, it is restricted to journal subscribers, whether they be individuals or institutions.

The debate over whether scientific research should be freely accessible or not is a heated one, with very little sign of a resolution either way anytime soon. Its proponents say that freely available scientific research advances the cause and progression of science. Its detractors say that without journals (most of which are subscription-based), there would be no peer-review process, and hence no quality control. It’s not that simple, however.

Perhaps a good place to start is with the inevitable. Michael Nielsen has written a very clear article on the matter, entitled ‘Is scientific publishing about to be disrupted?’. In it, he argues very convincingly that scientific publishing (including journals) is about to experience the same upheaval that the newspaper/print industries have been experiencing. At the hands of the same phenomenon: the internet. And, just like newspapers, there is relatively little that can be done about the situation.

One of the most important, and perhaps noticeable, agents of this change is scientific blogging: blogs written by scientists about their own and others’ work.

As Nielsen writes:

’Let’s look up close at one element of this flourishing ecosystem: the gradual rise of science blogs as a serious medium for research. It’s easy to miss the impact of blogs on research, because most science blogs focus on outreach. But more and more blogs contain high quality research content.’

They differ greatly from published articles in that they allow scientists to engage in an ongoing conversation about their own work and its developments, and are also a valuable means of drawing other scientists into that conversation.

The movement is catching on to such a degree that numerous highly respected scientists are blogging, including Terry Tao, Tim Gowers, and Richard Lipton (list supplied by Michael Nielsen). On home ground, the New Zealand science blogging movement is also picking up pace: there are a number of blogs already in existence, and there are plans afoot to aggregate these bloggers’ work in a project called Sciblogs (based on ScienceBlogs).

’Scientific publishers should be terrified that some of the world’s best scientists, people at or near their research peak, people whose time is at a premium, are spending hundreds of hours each year creating original research content for their blogs, content that in many cases would be difficult or impossible to publish in a conventional journal. What we’re seeing here is a spectacular expansion in the range of the blog medium. By comparison, the journals are standing still.’ (Nielsen)

The Open Access movement’s main aim, however, is not necessarily to dissuade scientists from publishing in journals (more on that later), or to encourage them to write blogs. Instead, it encourages them to deposit copies of their published papers (pre- or post-prints) in repositories which do give free access. Of these, arXiv is particularly prominent, and has a fantastic physics blog.

A recent issue of the Australian (OA) journal SCRIPTed looks at the issue in a paper entitled ’Open Access to Journal Content as a Case Study in Unlocking IP’. The paper examines the accessibility of reviewed, published papers from examples of the different types of science publishers, including PNAS, Elsevier and a major division of the US NRC.

Interestingly, the paper finds that the lack of access to published papers is not, as one might assume, solely the fault of publishers. Instead, it found that the publishers’ copyright restrictions were (relatively) liberal, in many cases allowing researchers to place their work in repositories of one form or another. The primary reason for the lack of forward momentum lay with the researchers themselves. In the paper’s conclusion:

’The exploitation of the opportunity has lagged, because of impediments to adoption, especially the lack of any positive incentive to self-deposit, and downright apathy. The outcomes to date are disappointing for proponents of OA and Unlocking IP…OA and Unlocking IP in the area of journal articles are at serious risk of being stillborn’.

No doubt, this last sentence is one which would thrill many journal publishers. However, the OA movement and blogging are not the only movements which threaten journals. These previous examples have opposed journals in a relatively passive way — they are (generally) quite happy to co-exist.

There is a far stronger movement lining up against journals. This movement, written about in Times Higher Education’s recent article ‘A threat to scientific communication’, speaks of growing unhappiness with publishing papers as the measure of a scientist’s success. An increasing number of (well respected) scientists, including the former editor of the British Medical Journal, say that the influence of being published in the ‘major’ journals is far too powerful, and that journal metrics such as the Journal Impact Factor are actually an impediment to scientific progress.

’”(Journal metrics) are the disease of our times,” says Sir John Sulston, chairman of the Institute for Science, Ethics and Innovation at the University of Manchester, and Nobel prizewinner in the physiology or medicine category in 2002.

’Sulston argues that the use of journal metrics is not only a flimsy guarantee of the best work (his prize-winning discovery was never published in a top journal), but he also believes that the system puts pressure on scientists to act in ways that adversely affect science – from claiming work is more novel than it actually is to over-hyping, over-interpreting and prematurely publishing it, splitting publications to get more credits and, in extreme situations, even committing fraud.’

A further comment:

’Noting that the medical journal articles that get the most citations are studies of randomised trials from rich countries, [Richard Horton, editor of The Lancet] speculates that if The Lancet published more work from Africa, its impact factor would go down.

’”The incentive for me is to cut off completely parts of the world that have the biggest health challenges … citations create a racist culture in journals’ decision-making and embody a system that is only about us (in the developed world).”’

(Another problem cited is that the JIF, because it focuses on only a few years, actually gives no indication of the long-term importance of scientific work.)
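For reference, a rough sketch of how the standard two-year impact factor is usually calculated (the generic definition, not a figure from any of the articles discussed here) makes that short window obvious:

$$
\mathrm{JIF}_{2009} = \frac{\text{citations received in 2009 by items published in 2007 and 2008}}{\text{number of citable items published in 2007 and 2008}}
$$

Anything that takes more than a couple of years to be recognised and cited simply never shows up in the number.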

Embargoes are also coming under attack (see the recording at the bottom of this page), as they make science seem more like an event than a linear series of incremental advances. This reminds me quite a lot of Professor Sir Peter Gluckman’s recent comments on the NZ media: what he said very closely matches this criticism, in that he feels that the New Zealand media fails to show science as a gradual process, instead showing it as a series of leaps forward. Which gave me cause to think: is it, then, actually the media’s fault? Particularly here in New Zealand, where many journalists are not able to specialise in science issues, and thus gain an understanding of scientific research’s continuity?

But I digress. As for the journals’ primary defence of their existence – the peer review process itself – that too is facing increased questioning. Journal publishers maintain that the peer review process is the only real means of quality assurance for scientific research. The reactions to this include the following:

  • That peer review itself is generally undertaken for free, meaning that journals are taking free work and, essentially, selling it back to scientists.
  • The peer review process itself needs to have some questions asked of it: who actually does the reviewing? How appropriate are they? How rigorous is the process? And, of course, timing is also an issue (the process can take months, greatly slowing the speed at which research becomes known).
    • In fact, this latter point brings to mind the recent debate over a paper published by well-known climate change skeptics, which attributes over 70% of climate change to the El Nino/Southern Oscillation weather patterns. While the paper was peer-reviewed, there have since been rebuttals (including this yet-to-be-published paper) saying that the maths used was incorrect, and bringing into doubt the quality of the peer review undertaken on the original paper (I’m not commenting on either, please note).
    • (For more on peer review, have a look at this post, about the recent results of a survey into the matter.)

Deep thought also has to be given to the tremendous amount of research lost because it doesn’t come up with a result. There are two types of experiments which have no end results (and I speak from personal experience here): either they were poorly set up, performed or analysed, or there simply are no results to be had.

While the first group should absolutely be ignored, the second can be very important to scientists. We used to say (in the market research consultancy at which I worked for a time) that if our analysis turned up nothing, ’it’s a learning in itself’. And it often can be, either by preventing other scientists duplicating the same research (a huge waste of time and resources), or because there really is nothing there to see, which suggests that effort be focused in another direction.

The remedy for science publishing’s woes is unclear. While everyone agrees that there is a problem, or at the very least a challenge, nobody is sure what shape the future of science publishing will take.

Michael Nielsen says that scientific publishers need to become technology-driven if they are to survive (he mentions Nature as one of the few publishers trying this), and that they must do so even if it means fundamentally changing the way they currently work.

’In ten to twenty years, scientific publishers will be technology companies. By this, I don’t just mean that they’ll be heavy users of technology, or employ a large IT staff. I mean they’ll be technology-driven companies in a similar way to, say, Google or Apple. That is, their foundation will be technological innovation, and most key decision-makers will be people with deep technological expertise. Those publishers that don’t become technology driven will die off.’

And while it seems that the peer review process is likely to stay, it will no doubt change in form. It might well imitate PLoS’s policy, which is to check that the results can be substantiated by the methods and data, but not to worry about whether the work is original or even important — that should be up to the world at large to decide.

Of course, something else to consider is this: if a paper is published in a repository or on a scientist’s own website/blog, and is then commented on by his peers…Is this not exactly what the peer review process is anyway? In that case, why be concerned with publishing?

However one looks at it, the industry is in for a massive upheaval: while it is uncertain just what form it will take, we can be sure that those trying to innovate to stay ahead of it have a chance of surviving, while those that stand still will, like their newspaper counterparts, face extinction.

Note: The Royal Society of New Zealand conducted some research into journal use/publication in 2004. The results are here.

Further note: the original title has been changed to include ‘challenge’ after some commentary about whether ‘threat’ was, in fact, the correct word (I admit, it probably wasn’t the best)

Global warming warning aimee whitcroft Aug 21

I’ve just come across this and, frankly, my first reaction was to burst out laughing.


Not, I hasten to add, because I think it’s funny in and of itself. And the article itself is very calm and cogent about things. It’s simply that the headline reads somewhat like an old, retro doomsday prediction.

And no, I’m not a denier. I’m not even a skeptic (although I would definitely say I’m a pragmatist). To be honest, it’s not something I feel I’m nearly well-informed enough about to comment on.

Happily, a number of other people are doing very fine jobs of commenting on the issue: Hot Topic is a great example, and the Science Media Centre tracks what’s happening in the coverage.

Gamma-ray bursts get even more sci-fi aimee whitcroft Aug 18

Gamma-ray bursts really are the stuff of science fiction. And something of a mystery.


They’re the most impressive explosions the universe has been able to offer after the tremendous effort of the Big Bang. If one happened anywhere near Earth (and by near I mean within a thousand light years or so, so ‘near’ only on a cosmic scale) and was pointed at us, the radiation would kill all life here. Even the cockroaches (probably).

Up until now, prevailing thought had it that they were formed by the collapse of a massive star into a black hole. Now, however, a new theory is being considered (again): that they occur as the result of a black hole burrowing into the middle of a star and then consuming it. A sort of cosmic Alien, if you will.

Happily for us they’re directional, and it seems that they’re more likely to happen on the outer edges of the universe. Scientists have posited that this is because that’s where the older stars are, and the differences in chemistry between older stars and newer stars mean newer ones are more likely to be prosaic about the matter and avoid the huge effort involved in producing a gamma-ray burst. Maybe. Another theory says they occur in regions with low metallicity (not a feature of the Milky Way, happily).

Either way, they’re fascinating beasts and something to keep an eye on (possibly one wearing sunglasses). For those of you interested, we apparently pick up about one a day…

More evidence that cockroaches may be the pinnacle of evolution aimee whitcroft Aug 18

…Sorry, those of you who thought it was humans. (As a side note, Terry Pratchett is particularly hilarious on the subject in The Last Continent)


Alright, for those of you who prefer accuracy in your statements: no, they’re not really the pinnacle of evolution, for after all the theory somewhat precludes such a notion. But they are remarkably resilient. I had a few doozies in the kitchen of an ancient house I inhabited at one point in Cape Town, and I can confidently say that they’re well-nigh unstoppable, particularly if they’re big and old.

Back to the point of the post, though: it’s come to light (unlike the creatures themselves) that not only are they capable of withstanding nuclear fallout, but that they could also survive climate change. Apparently, they are able to hold their breath in order to reduce water loss – a particularly useful trait in Australia, where the research was conducted.

I’ll happily admit I quite admire them. They’re a brilliant example of how 250 million years of evolution can give one, if not backbone, then at least a pretty remarkable exoskeleton.

Sunbed silliness aimee whitcroft Aug 18

I will admit to being slightly biased on this one, having heard, many years ago, about the entirely legitimate concerns over the cancer risks of the UV exposure associated with sunbed use.


And it’s become a rather heated (haha) issue here in New Zealand, particularly as it’s now been confirmed that sunbeds are, well, carcinogenic. The IARC is behind the research, and they tend to know what they’re talking about (in fact, they’re the WHO’s agency dedicated to researching human cancer).

Some countries have legislated around this, either by banning sunbed use altogether, or by restricting it to adults. In other cases, they allow teenagers to use them, but only with adult consent.

But in New Zealand, none of this is the case. The sunbed industry here operates under a voluntary code (generally code for ‘pays lip service to’), which precludes people under 18 from using the service. In addition, a spokesperson has said that people are aware of the risks, but choose to use the treatments anyway.

Now, however, an article on Stuff has shown that this isn’t the case at all. The investigation carried out as part of the article found that most of the sunbed operators looked into showed no sign of abiding by the voluntary code: they allowed underage clients to use the beds, and in many cases did not tell people about the associated health risks or even warn them to use the goggles provided.

Not OK, guys, not OK at all. After all, having your clients die is generally accepted, even in our gung-ho world, as bad (or at least unsustainable) business practice.

Biomimicry and AskNature aimee whitcroft Aug 14

Biomimicry is big right now. Or small. Or waterproof. Or able to run at full tilt (haha, considering the angle of activation is 10 degrees) across walls and even ceilings.


It’s a fascinating subject, and one of those which, like many great brainwaves, has that ‘well, duh’ component to it. Study nature’s designs to see how it’s been doing things before trying to reinvent the [insert item here]. Well, of course!

Still, apparently it’s something that has not been part of design dogma for some time. Silly us.

The gecko is definitely getting a fair amount of attention (see here and here, for starters) at the moment over its climbing ability, but nature abounds with brilliant things, and we’re learning more and more from them, as this talk shows.

But now, there’s this: AskNature, the site Janine Benyus is working on. Because, of course, it’s one thing to say ‘well, study Nature before you design something’, but it’s an entirely different matter to attempt to first figure out what to study, and then how to access the data on it. Particularly if it’s academic data and you are not, frankly, an academic. The site is another fine example of how freeing information up, not shutting it away, benefits everybody.
