Archive February 2010

Tsunami – if the big one hit Peter Griffin Feb 28


New Zealanders in coastal areas are right now preparing for potential wave surges as tsunami warnings are in force for the entire coast of the country.

This follows the 8.8 magnitude earthquake in Chile last night – my colleagues at the Science Media Centre rounded up some analysis of the quake from British scientists last night.

Preparedness for possible tsunamis has been good this time, following a patchy response from Civil Defence in the wake of the earthquake late last year which triggered a tsunami off the Samoan islands. It looks like New Zealand will escape any major damage from this tsunami, but a tsunami originating closer to home or resulting from a higher magnitude quake as far away as South America could have devastating consequences for the country, according to a tsunami risk assessment by Civil Defence updated just a few days ago.

As the report states:

A 2005 review showed estimated damages of $12-21 billion nationally from a 500-year return period tsunami — approximately 10% probability of occurrence in 50 years, or annual probability of 0.2%.

The risk in terms of mortality in the 19 urban centres assessed in the review (for the same return period) arises from losses in many towns and cities in New Zealand, but is predominantly from those along the east coasts of the North and South Islands as a result of large earthquakes in South America (about 60% of total deaths) or along the Hikurangi subduction margin of the eastern North Island (about 34% of total deaths). Tsunamis generated by offshore local faults are likely to account for 5% of deaths and those generated by regional sources 1%.
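The report’s two probability figures are consistent with each other – here is a quick sanity check of the 500-year return period arithmetic (sketched in Python; the ~10% figure assumes each year is independent):

```python
# A 500-year return period implies a 1-in-500 (0.2%) chance in any given year.
annual_p = 1 / 500

# Chance of at least one such event in 50 years = 1 minus the chance of
# no event occurring in every one of those 50 years (assuming independence).
p_50_years = 1 - (1 - annual_p) ** 50

print(f"annual: {annual_p:.1%}, over 50 years: {p_50_years:.1%}")
# about 9.5% - the report's "approximately 10%"
```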

So what are the chances of a massive earthquake triggering a one-in-500-year tsunami from the South American region? Well, until now researchers have really only had the massive 1960 earthquake in Chile to model such events on:

…the 1960 tsunami, although caused by a much larger earthquake than the 1868 event (magnitude 9.4, possibly 9.5), occurred on a part of the South American plate boundary that is not as well oriented to New Zealand as the 1868 location. It produced a smaller tsunami in New Zealand than would have occurred had the location been ideally oriented.

Nevertheless, the 1960 tsunami caused run-ups of up to four metres in parts of the North and South Islands. The magnitude of the 1960 earthquake probably represents the upper limit for earthquakes for the whole South American coastline (and worldwide).

Computer models (Power et al., 2004), combined with historical observations, suggest that South American earthquakes with magnitudes less than 8.5 generate a minimal risk of a damaging tsunami in New Zealand. The historical record of Peru and Chile, which is hundreds of years longer than New Zealand’s, indicates that large earthquakes and tsunami have occurred relatively frequently in the past 450 years. Nine earthquakes with estimated magnitudes >8.5 caused near-source run-up heights near to, or greater than, those produced locally by the 1868 event, and hence probably produced significant tsunami in New Zealand prior to European settlement.

The Civil Defence report notes that while there has been little damage and few deaths from tsunamis in New Zealand since European settlement, Maori traditions make note of tsunamis that killed many people. Potentially, then, the devastation in the last 1,000 years or so was caused by larger-than-magnitude-8.5 quakes in South America that triggered tsunamis.

A presentation from Victoria University’s Dr John Townsend on the Chilean earthquake:

A beeping good idea for low-cost communication Peter Griffin Feb 25


Yesterday was intense. As a judge on the preliminary round of Microsoft’s Imagine Cup, which pits teams of university students against each other in a bid to find the four most innovative and potentially world-changing projects, it was a blur of PowerPoint slides and Dragon’s Den-style questioning as we got through twenty 20-minute pitches in the course of the day.

MS tags - potential to replace paper receipts

The idea of the competition is for students with software, engineering or technical backgrounds to come up with solutions aimed at tackling the world’s big problems – the UN’s Millennium Development Goals were given to contestants as an indication of the big issues they should be seeking to tackle.

As such, the projects fell quite neatly into a few categories – reducing our carbon footprint, improving the efficiency of health services, getting better cut-through with education programmes and fighting the exploitation of children, especially in third world countries.

There were some big hairy audacious plans – one team outlined a scheme to buy old transport ships, fit them with desalination plants and have them rove around the coast of India purifying water as regions suffered droughts, water shortages or natural disasters.

Another came up with a great concept to get rid of thermal paper receipts, which are used by a large number of retailers but are not recyclable, creating a huge amount of waste and killing millions of trees unnecessarily. The idea involved using a new type of technology Microsoft has developed, the MS tag, which is like a souped-up barcode featuring a nice picture that can be scanned. The MS tag could be swiped at the checkout at a supermarket, allowing the customer’s receipt to be sent to an online portal instead of being printed out. You could then get some nice analytics online about your supermarket spending, and redeem discounts by printing out individual coupons or by displaying them on your mobile phone screen and having them scanned when you return to the store.

But the teams had to not only have a big concept, but demonstrate that they were some way down the track to making it a reality, which led us to our top four finalists, details of which are below.

Team One Beep was the highest scoring team, with a solution to a big problem that’s ingenious in its simplicity and has already proven to be technically possible. The idea is to leverage off the growing number of XO laptops available in third world countries as part of the One Laptop Per Child (OLPC) programme, which is active here in New Zealand, where a network of testers, including Sciblogger Fabiana Kubke, helps refine the low-cost computer’s features.

OLPC + FM radio = lessons beamed to computers

There are 1.2 million OLPC laptops now in use, but there remains a big problem – third world countries don’t really have the communications infrastructure to get content out to those laptops in a reliable fashion. Mobile networks often don’t extend into rural areas, and satellite and fixed wireless systems are too expensive. Team One Beep came up with a great idea – why not use the readily available FM broadcast band to send out a stream of data that can be picked up by a bog-standard FM radio? The signal is then fed into the sound card of the XO laptop and recorded using a small piece of open source software. The software then converts the audio signal, which consists of a stream of beeps representing letters, into text and assembles it as a document.
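One Beep hasn’t published its exact encoding, but the basic idea – characters sent as a stream of audio tones that a sound card can record and software can decode – can be sketched with a simple two-tone scheme. All of the numbers here (tone frequencies, bit rate, sample rate) are my own illustrative assumptions, not the team’s:

```python
import math

SAMPLE_RATE = 8000             # audio samples per second (assumed)
BAUD = 50                      # bits per second (assumed, well below 2Kbps)
FREQ0, FREQ1 = 1000, 2000      # Hz: tone for a 0 bit / a 1 bit (assumed)
SPB = SAMPLE_RATE // BAUD      # samples per bit

def encode(text):
    """Turn ASCII text into a list of audio samples, one tone burst per bit."""
    samples = []
    for byte in text.encode("ascii"):
        for i in range(8):                          # most significant bit first
            freq = FREQ1 if (byte >> (7 - i)) & 1 else FREQ0
            for n in range(SPB):
                samples.append(math.sin(2 * math.pi * freq * n / SAMPLE_RATE))
    return samples

def goertzel_power(chunk, freq):
    """Signal power at a single frequency (Goertzel algorithm)."""
    coeff = 2 * math.cos(2 * math.pi * freq / SAMPLE_RATE)
    s1 = s2 = 0.0
    for x in chunk:
        s1, s2 = x + coeff * s1 - s2, s1
    return s1 * s1 + s2 * s2 - coeff * s1 * s2

def decode(samples):
    """Recover text: pick the dominant tone per bit slot, repack into bytes."""
    bits = []
    for i in range(0, len(samples) - SPB + 1, SPB):
        chunk = samples[i:i + SPB]
        bits.append(1 if goertzel_power(chunk, FREQ1) > goertzel_power(chunk, FREQ0) else 0)
    chars = []
    for i in range(0, len(bits) - 7, 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        chars.append(chr(byte))
    return "".join(chars)
```

Each bit of a character becomes a short burst of one of two tones, and the decoder asks which tone dominates each burst – a rough stand-in for what One Beep’s open source decoder presumably does with far better error handling and compression.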

Radiotext-type services using the FM network are not a new idea – here’s one project from Europe seeking to offer similar services, and digital radio has already rolled out in many countries, delivering weather, traffic and channel information to radio users. But the innovative part of One Beep’s solution is the interface between an FM radio and the XO laptops used as part of the OLPC programme. With some refinements, this should be a piece of software that is simple to use and allows children in remote villages in Africa to be sent regularly updated school lessons.

Currently, the data throughput One Beep is achieving is fairly low – 2Kbps (kilobits per second). But the team is confident compression technology can increase this to 10Kbps. I think they’d find that others working in this area may be worth partnering with to push the data throughput possible via FM radio even higher.
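To see what those rates mean in practice, here’s some rough arithmetic for a hypothetical 50KB text lesson (the file size is my own assumption):

```python
# Transmission time for a hypothetical 50 KB lesson document
lesson_bits = 50 * 1024 * 8            # 50 KB expressed in bits

for kbps in (2, 10):                   # current rate vs hoped-for rate
    seconds = lesson_bits / (kbps * 1000)
    print(f"{kbps} Kbps: {seconds:.0f} s (about {seconds / 60:.1f} min)")
```

So the jump from 2Kbps to 10Kbps would cut a lesson’s broadcast time from roughly three and a half minutes to well under one.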

This is a solution that could be rolled out tomorrow – it requires use of a small sliver of radio spectrum, a radio transmitter to send out the signal (the further it needs to go the more powerful the transmitter needs to be) and the software has to be installed on each OLPC machine. Hopefully the competition and One Beep’s making it to the final will give the project the profile it needs to become reality.

The four New Zealand finalists listed below will now compete in another round of judging and give presentations in front of an audience of 500 people in April with the winning team then heading to Poland to take part in the world champs.

The Imagine Cup finalists:

Team One Beep — From the University of Auckland, Team One Beep developed a system for delivering data over conventional radio transmitters, the purpose of which is to enable educational material to be delivered to impoverished schools and communities in areas of the world where there are no phone lines, let alone internet services.

Team Enpeda — Also from the University of Auckland, Team Enpeda devised a working prototype of a computer controlled driver assistance system. It uses a cell phone camera and is able to detect the road environment ahead and warn drivers if they stray off course and into danger.

Team eUtopia — From the University of Waikato, Team eUtopia came up with a live video distribution service that links conservation organisations to the public and allows for remote monitoring, private research and even surveillance of animals.

Team Vital Link — From the University of Auckland, Team Vital Link tackle the issue of poverty, in particular, fair trade for artisans in impoverished countries whose handicrafts are often undervalued. The team aims to provide a global marketplace by capitalising on the viral marketing capabilities of Facebook to help these people make enough money to improve their daily lives.

That's me in the middle in the blue shirt with the stupid grin alongside fellow judges and the finalists of the Imagine Cup

Technology’s Holy Trinity: Sex, bombs and burgers Peter Griffin Feb 23

1 Comment

My former New Zealand Herald colleague Peter Nowak, now firmly ensconced back in Toronto at the Canadian Broadcasting Corporation, has just published his first book, and it’s based on an intriguing premise.

Nowak has been poring over the annals of 20th century innovation and come to the conclusion that war, porn and fast food created a good deal of technology as we know it.

Chapter 1 of the book Sex, Bombs and Burgers is free to download and sets the scene well, outlining how the humble kitchen microwave has its origins in the technology invented by the British and developed by the Americans to create radar for military use. That application of technology changed the course of the war and, in the post-war world, spawned a consumer cooking revolution still going strong:

The age of cooking convenience had finally arrived in the home. The microwave oven was the perfect invention for a postwar society that put a premium on speed. With an increasing number of families seeing both parents going off to work, spare time was becoming more and more precious. The microwave helped facilitate those busy lives.

Chapter 1 also looks at the development of Teflon, which was discovered by accident in the 1930s and developed by chemical giant Du Pont as a sealant in its plutonium plants – Du Pont was a major player in creating the US military’s nuclear capability during the war.

We also learn that in the run-up to World War II, Germany, forbidden from stockpiling materials that could contribute towards military applications, came up with synthetic materials that would become crucial to the war effort in Nazi Germany as Hitler’s supplies of oil and rubber were cut off. German industrial giant Farben was key in this synthetic materials development, which extended to manufacturing the deadly Zyklon B gas used to kill thousands of concentration camp prisoners.

Several of today’s largest multinational firms owe part of their post-war successes to the often ill-gotten intellectual property inherited from Farben, including film manufacturer Agfa-Gevaert, chemical maker BASF and pharmaceutical companies Sanofi-Aventis (derived from a merger of Farben spin-off Hoechst and France’s Rhône-Poulenc Rorer) and Bayer.

And, in hindsight, a fairly creepy comment from Adolf Hitler:

’It is the task of science to give us those materials nature has cruelly denied us. Science is only free if she can sovereignly master those problems posed to her by life.’

We haven’t got to the porn and fast food yet, but there are plenty of hints on the internet as to the ground Nowak is likely to cover. Check out this PC World article – 12 ways the sex trade has changed the web. Online payment systems and web video streaming are just two innovations with their roots in porn.

Ironically, we are also seeing consumer technologies, such as video games becoming the tools of innovation for the military as games are used to sharpen the skills of soldiers.

Nowak’s book is in bookstores here next month. In the meantime you can check out his blog, where he has a running commentary on the link between technology and boobs, bombs and burgers.

Network failures – working out the compensation Peter Griffin Feb 23

No Comments

I was in Wellington bar Matterhorn when the XT network failed this time. The first I knew of it was a text message from my boss asking if my phone could make calls, as she had been unable to reach me from her XT handset. Sure enough, a few attempts to make calls failed, and while I could still send text messages, the outage stretched on into the evening for around three hours.

Since then I’ve spoken to several senior business people who have quite genuinely told me they are heading for the exits where Telecom’s XT network is concerned – and that will either mean a move back to the CDMA network or a move to Vodafone. The blood on the floor at Telecom and Alcatel Lucent may go some way to placate corporate customers furious at the numerous outages that have occurred, but money talks and only serious compensation will keep those wavering onboard.

So how do you decide what is fair compensation for a network outage? Well, there’s actually a scientific approach to calculating such things that has been proposed by researchers. The paper Game Theoretic Outage Compensation in Next Generation Mobile Networks, published in May last year in the journal IEEE Transactions on Wireless Communications, proposes a game theory-based compensation algorithm.

Here’s the paper’s abstract:

Changes in network dynamics (e.g., link failure, congestion, buffer overflow and so on) may render ubiquitous service access untenable in the Next Generation Mobile Network (NGMN). Since the commercial viability of a network in a competitive market depends on the perceived user satisfaction, to atone for the loss of the guaranteed service access, it is desirable to compensate the users either with future quality-of-service (QoS) enhancements and/or price reductions.

Focusing on the price reduction aspect, this paper proposes a non-cooperative game theory based compensation algorithm that derives the best outage compensation (i.e., price reduction for the outage period t) over different service types. Taking into account all-IP based applications in the future, the service types are categorized into different classes such as flat rate based (i.e., cents for the entire session(s)), time based (i.e., cents per minute), volume based (i.e., cents per MB), etc., whereupon the compensation algorithm is translated into an n-player game, based on the current subscription profiles. With step sized cost reductions, the selection of the outage compensation is governed by the Nash equilibrium points and fairly allocates cost reduction among the ongoing service types.

Telecom offered up $5 million in compensation to customers following its second major XT outage. Whether that was a mathematical calculation based on any compensation model hasn’t been revealed, but it is likely that any further compensation will be over the odds on anything an algorithm can come up with…
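For comparison, the crudest baseline – ignoring the paper’s game theory entirely – is a simple pro-rata refund of each plan’s charges for the outage window. The plan prices here are invented purely for illustration:

```python
# Naive pro-rata refund for a three-hour outage (not the paper's method)
outage_hours = 3
hours_in_month = 30 * 24

plans = {                      # hypothetical monthly charges, NZ$
    "flat rate": 60.0,
    "time based": 45.0,
    "volume based": 30.0,
}

for name, monthly in plans.items():
    refund = monthly * outage_hours / hours_in_month
    print(f"{name}: ${refund:.2f}")
```

A refund of a few cents per customer is exactly why simple pro-rata models feel more like token gestures than compensation – and why the paper argues for deriving price reductions per service type rather than applying one flat formula.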

RNZ woes – what does it mean for science coverage? Peter Griffin Feb 19


If you work in or near the media, you’ll have heard the nervous buzz from journalists over the last few days about the depressing options facing Radio New Zealand if it is to cut costs and stay within its flat, $38 million budget.

The Journz mailing list where a smattering of Kiwi hacks hang out has been fairly downbeat on the subject today and the Save Radio New Zealand Facebook group has been racking up the numbers – nearly 4,600 members so far.

If you listen to Radio New Zealand across its programming from Morning Report and Checkpoint to Nine to Noon and This Way Up, you’ll notice that the station has a strong commitment not only to quality content and thorough, balanced journalism, but that science and environment stories get a lot of attention. In fact, it wouldn’t be a stretch to say that RNZ probably has the biggest commitment to science and environment coverage in the country – which is exactly how it should be for a public broadcaster.

On the Our Changing World show it employs two producer/journalists, Alison Ballance and Ruth Beran (on loan from Australia’s ABC while Veronika Meduna is on maternity leave), who produce a solid show every Thursday night looking in some depth at the scientific research going on in New Zealand. Ruth did an excellent piece recently looking at some of the problems the Australian Synchrotron has been facing of late, which is important to New Zealand as we are an investor in the project.

Kim Hill’s Saturday morning line-up of guests includes a good number of scientists, and Hill, a Companion of the Royal Society of New Zealand, has done a huge amount for local science communication efforts with her interviews with the likes of Professor Sir Paul Callaghan. Her interview last year with 9/11 conspiracy theorist Richard Gage was controversial but showed a sceptical and science-literate mind at work.

Source: NZ Herald

Elsewhere, the commitment to science and environment coverage is actually on the increase. In the general newsroom there aren’t, strictly speaking, any reporters who get to focus on science all the time, but new hire Will Hine, formerly of the Southland Times, has been given the go-ahead to develop the round and has hit the ground running. It helps that seasoned science writer Kim Griggs is employed in the RNZ newsroom and helps set the news agenda.

Ian Telfer keeps a close eye on the environment beat, and other reporters such as David Reid and Heugh Chappell regularly pick up science-related stories. Kathryn Ryan seems to be having a growing number of scientists on Nine to Noon, and Bryan Crump in the evenings regularly conducts 10–12 minute interviews with scientists. Over the weekend you’re likely to hear a bit of consumer-focused science on the This Way Up show with Simon Morton, and Chris Laidlaw regularly looks at science and environment-related stories in his panel discussions.

To cap it off, the Royal Society has a popular science lecture series that is broadcast on Radio New Zealand each year. So from my perspective as head of an organisation trying to encourage science communication in this country, RNZ is a bastion of decent coverage – from three-minute reports through to half-hour documentaries. The thought, then, of management being forced to take the knife to formats that work and are delivering good quality journalism is pretty depressing.

In the world of commercial media, the first things that get cut in terms of editorial costs are areas of specialist coverage. Newspaper sections shrink or disappear entirely. Specialist commentators are laid off in favour of generalists who can churn out words on any subject. The result of that, I think, has been a real erosion in the quality of commentary coming from the mainstream commercial media. Even the pundits themselves are beginning to realise that. Take Tracey Barnett’s revealing column in today’s Herald in which she admits:

Someone finds a way to start the news narrative and like clueless lemmings, we all jump into the same plotline to finish each other’s sentences, clinging to page one. You don’t notice it when you’re a daily reader. But when I returned to it with fresh eyes, I saw entire waves of news narratives that felt hopelessly unimportant to any sane man’s idea of the big picture.

Ironically, her column sits beneath another woeful blast of hot air from Jim Hopkins on the subject of Climate Change, where Jim lemming-like parrots the contents of a flawed Daily Mail article. Hot Topic’s Gareth Renowden explains where Jim went wrong – again.

The worst thing Radio New Zealand could do is chop programmes that actually allow journalists to work on longer-form stories, shows like Our Changing World and This Way Up which I occasionally contribute to. Just when our state broadcaster is ramping up coverage of important science and environment issues, it would be a huge leap backwards to discard this type of coverage, in favour of the generalist approach of the commercial media.

Sure, our public broadcaster has to be sustainable and focused on containing costs, but I think there’s an increasing awareness of the fact that RNZ sets the standard for the media in general in this country, particularly when it comes to science and environment reporting. That’s something worth maintaining.

20 years of DNA forensics in New Zealand Peter Griffin Feb 17

1 Comment

In the age of CSI-style crime dramas it is difficult to imagine a time when DNA evidence wasn’t part of any criminal investigation.

But the first court case in New Zealand where DNA evidence was presented happened only as recently as 1990 and on the world stage, the first case involving DNA profiling went to court in the United Kingdom in 1986.

DNA can distinguish one individual from another. Photo credit: ESR

One of the scientists working at the time for the UK’s Forensic Science Service, who was instrumental in developing the DNA profiling techniques that featured in that 1986 case, is Dr Peter Gill, now a senior lecturer at Strathclyde University in Glasgow and a world-renowned expert in DNA forensics.

Dr Gill is in New Zealand this week visiting colleagues at Environmental Science and Research, the Crown research institute which does all of the New Zealand Police’s forensic work.

In this briefing at the Science Media Centre today, Dr Gill and ESR scientist Dr Sally Ann Harbison took journalists through the big scientific and legal developments of the past 20 years in the field of DNA forensics. The ESR booklet below is a good backgrounder to the New Zealand-specific developments.

Banking on DNA

One of the major developments in the evolution of DNA forensics, both here and in the UK, was the establishment in the mid-nineties of DNA databanks that hold profiles of known criminals, which can be compared against samples of DNA – usually blood, semen and saliva – taken from crime scenes. In fact, New Zealand was the second country, after the UK, to set up a DNA Databank. Swabs are taken from crime scenes and analysed, then a DNA profile – a series of numbers that identifies a person’s unique DNA make-up – is created from the DNA material gathered and stored electronically in the DNA Databank.

Interestingly, the New Zealand DNA Databank has over 120,000 DNA profiles in it, covering around two per cent of the population, while the UK DNA database, formed around the same time, has five million profiles – some 12 per cent of the UK population. Here are some figures from ESR on the make-up of the DNA Databank and the success rate in matching profiles to crimes:

• over 105,000 DNA profiles from individuals
• over 23,000 DNA profiles from crime samples
• link rates of 65% crime-to-person and 34% crime-to-crime
• over 15,000 links reported to date.

Despite the larger proportion of the UK population covered by the UK DNA Database – thanks to UK police having wider powers to obtain DNA samples from suspects arrested for “recordable” offences – the crime-to-person link rate is higher in New Zealand than in the UK (where it sits somewhere around 53–55 per cent, I am told).

At an event at the Beehive tonight, when the Minister for Research, Science and Technology, Dr Wayne Mapp, enquired as to why this was, ESR scientists explained that the police generally have a good idea of who the criminals are anyway – and presumably have been able to obtain DNA samples from them along the way.

Nevertheless, from July the New Zealand Police will have wider powers, under the Criminal Investigations (Bodily Samples) Amendment Act 2009, to take DNA samples from people charged with various offences. That could lead to more DNA profiles going into the DNA Databank.

Mission creep?

DNA Databanks exist around the world – the largest being in the US. Legislation governing who DNA samples can be taken from and when varies from country to country. Interestingly though, the UK’s DNA Database has become increasingly controversial as the number of DNA profiles of innocent people (those arrested on recordable offences but not prosecuted) rises to around one million of the five million DNA profiles on record.

Last week, one of Dr Gill’s colleagues who was integral to developing DNA forensics techniques, Sir Alec Jeffreys, voiced his criticism of the UK’s DNA Databank saying the profiles of innocent people should be removed from it:

‘Innocent people on the database are being used inefficiently to solve future crimes – and that goes against their civil rights.

‘If you took one million profiles off the database and replaced them with one million randomly selected profiles, would detections rise?’

Despite the concerns of critics, there’s no denying that storing DNA profiles so they can be compared with DNA taken from crime scenes has resulted in a lot of criminals being identified, cold cases being solved and innocent people being exonerated. As such, the linking of DNA databases across borders would be increasingly useful for solving crimes as criminals become more mobile. Dr Gill says such linkages are underway, particularly across EU countries, and he envisages a sort of Google-style search engine that would allow law enforcement agencies in any one country to search several DNA databases at once.

Google’s fibre play – the key to NZ internet woes? Peter Griffin Feb 12

No Comments

Google’s corporate blog has already been the source of one jaw-dropping announcement this year – the internet giant’s promise to stop filtering results on the Google China web domain, risking its entire business in China.

Now we have Google revealing plans to invest in a trial fibre-to-the-home network in the US that will reach 50,000 to 500,000 customers and offer connection speeds of up to 1Gbps (gigabit per second).

Anyone who has been following the company’s forays into investing in transoceanic cable infrastructure, a free city-wide wi-fi network covering its home town of Mountain View, and its bids to open up mobile spectrum and unused TV frequencies won’t be too surprised by this, but the scale of Google’s plans is unprecedented. We now have a cash-rich, leading internet content company building high-capacity infrastructure designed to supply access to the internet. Microsoft went down this path on a less ambitious scale when it put US$1 billion into cable operator Comcast back in the late nineties.

“Our vision for connecting the world of PCs and TVs has long included advanced broadband capabilities to deliver video, data and interactivity to the home,” said Microsoft founder Bill Gates at the time.

“Comcast’s integrated approach to cable distribution, programming and telecommunications complements that vision of linking PCs and TVs. Today’s announcement will enhance the integration of broadband pipes and content to expand the services offered to consumers,” he added.

Microsoft’s vision of merging the PC and TV didn’t really take off, at least until it debuted its gaming console, the Xbox, which through the Xbox Live service can now supply on-demand video to the lounges of millions of Americans. Google, on the other hand, has a suite of highly successful online services and a major revenue generator in its online advertising business. With YouTube, Google Earth and Street View it has indicated a desire to deliver richer applications, which require higher bandwidth to access. The only limit now to Google’s ambitions (and profitability) is a lack of bandwidth, which is why the company is getting serious about owning internet infrastructure.

Comcast and AT&T will be very nervous about this development – with US$25 billion in cash reserves, Google is far more able to invest in capital expenditure than any of the traditional telecoms players. But is Google a threat or a potential partner, particularly for less well-funded players in the US and abroad?

Innovation booster?

Google is promising to keep its network open-access, so it will partner with internet providers who wish to offer services over it – a similar sort of plan to what the New Zealand Government is undertaking with its next-generation broadband network, into which it is putting $1.5 billion.

While Google is investing in undersea cables across the Pacific and in fibre networks linking its data centres across the US to lower its significant international bandwidth costs, its fibre-to-the-home service is being pitched as an investment to stimulate innovation in online services, rather than a desire to generate revenues from providing wholesale internet access. As such it embodies the argument that “build it and they will come” – new and existing players will innovate and create cutting-edge new services to fill the fat pipes Google lays.

Here in New Zealand, critics of the Government’s fibre plans are sceptical of this approach and suggest there isn’t really demand for the types of services that need 100Mbps (megabits per second) – the Government’s targeted speed for fibre services for the majority of the country within a decade, which, incidentally, is just a tenth of the bandwidth capacity to be built into Google’s trial networks. However, YouTube and Google Earth are themselves prime examples of services that emerged to fill the capacity made available by faster internet access speeds. Who knows what could already be in the pipeline at Google and other internet companies that could revolutionise how we communicate or use the internet once higher access speeds are available.
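The difference those speeds make is easy to put in concrete terms – for instance, pulling down a hypothetical 4GB high-definition movie (the file size is my assumption, and the speeds are ideal, overhead-free rates):

```python
# Ideal download time for a hypothetical 4 GB HD movie at each speed
movie_bits = 4 * 10**9 * 8             # 4 GB expressed in bits

for label, bits_per_sec in (("100 Mbps (NZ target)", 100 * 10**6),
                            ("1 Gbps (Google trial)", 10**9)):
    print(f"{label}: {movie_bits / bits_per_sec:.0f} seconds")
```

At the Government’s 100Mbps target that’s a download measured in minutes; at Google’s trial speed it drops to around half a minute – the kind of gap that changes what services people will build.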

The ultimate public-private partner?

For countries like New Zealand, which have limited funds to invest in broadband infrastructure, Google would be a dream partner. The Government will partner with potentially dozens of private players to build out its fibre network but very few that are totally focused on stimulating innovation in internet services as Google is.

New Zealand has long been a favoured testing ground for new wireless technologies. If it put up its hand to become a large-scale test scenario for a Google fibre network outside of the US, it could show the world what can be achieved when a country moves to truly high-speed internet access.

Where Google could also play a role is as an investor in a second undersea trans-Tasman cable linking Australia and New Zealand. For a modest investment, Google could ensure connectivity to millions of customers for access to its own services, but also stimulate innovation in web services on a national level by wholesaling international access across the telecoms industry – imagine the difference in web usage patterns if data caps were removed due to more competitive international bandwidth pricing. The Government or a private/public consortium could do well to put an innovative proposal to Google about this in the absence of any firm plans for construction of a second cable.

The world still needs Wikileaks Peter Griffin Feb 02


In the days before the internet, journalists the world over hoped and prayed for the day when a plain manila envelope turned up in the post, crammed with sensitive documents sent by a whistleblower.

Wikileaks: The motherlode of leaked documents

When I worked at the Herald editing the tech section, such a bundle turned up one day and produced a front-page story. Unfortunately, the disillusioned company man who sent it was silly enough to have printed out the documents from his own work computer.

Within hours of the story being published, representatives of the company were at his front door asking for his swipe card and informing him he didn’t have a future with the company.

These days, whistleblowers have a more direct outlet for leaking information – the internet. And for three years, one controversial website, Wikileaks, has been a public repository for sensitive documents leaked from government departments, military installations and corporations all over the world, uncovering corruption and revealing dubious activity the public would otherwise have been none the wiser about.

In many respects, Wikileaks has come to fill the void left by the departure of investigative journalists from newsrooms. Source documents don’t lie, and Wikileaks posts them unedited and without commentary, allowing readers to make their own judgements about their contents.

Anyone with a PDF swiped off the company network and a beef, or a good reason for blowing the whistle could post a document to a blog set up anonymously. Eventually it would find an audience, probably after being reposted and retweeted until a reporter finally found it – or was forwarded the link to it by someone.

But Wikileaks is the source of the Nile for leaked material, attracting millions of visitors and millions of submissions of documents. As such, its current problems stem directly from its success. The little website that shook the world has collapsed under its own weight, out of money and with not enough manpower to wade through the ever-increasing slush pile of submitted documents.

Who knows, there could be the next Watergate scoop sitting in Wikileaks’ inbox, but according to Aussie Julian Assange, who set up the website in 2006, the site is out of money and, due to its policy of not accepting funding from companies, governments, activists and lobby groups, will only survive if the public comes forward with donations.

There isn’t much New Zealand content on Wikileaks that rocks the boat too much. Last year, a commercial agreement between Vodafone New Zealand and new mobile entrant 2Degrees was put on Wikileaks after the National Business Review was ordered by the Commerce Commission to remove details of it from its website.

The Australian blacklist for Government web filtering was posted to Wikileaks, which was incredibly embarrassing for the Australian Government as it revealed numerous websites that were neither explicit nor offensive.

Wikileaks could have become the place where name suppression orders are flouted and laws broken. But the site has a distinct public interest focus. Occasionally it has crossed the line, in my opinion, but usually it has a good reason for featuring the documents it posts.

The tricky, time-consuming and costly part of what Wikileaks does is verifying the authenticity of the documents it is sent. It has a network of people around the world checking to make sure documents are legit. Surely a band of keen journalism students would be willing to assist in the extensive legwork required in this task. What better training in investigative journalism could you ask for?

This New Scientist article from a couple of years ago gives some information about how Wikileaks manages to go about its business incognito and protect the identity of submitters, while this Wikipedia entry explains how it avoids being taken down by lawsuits or hacking attacks:

Wikileaks describes itself as “an uncensorable system for untraceable mass document leaking”. Wikileaks is hosted by PRQ, a Sweden-based company providing “highly secure, no-questions-asked hosting services”. PRQ is said to have “almost no information about its clientele and maintains few if any of its own logs”. PRQ is owned by Gottfrid Svartholm and Fredrik Neij who, through their involvement in The Pirate Bay, have significant experience in withstanding legal challenges from authorities. Being hosted by PRQ makes it difficult to take Wikileaks offline. Furthermore, “Wikileaks maintains its own servers at undisclosed locations, keeps no logs and uses military-grade encryption to protect sources and other confidential information.” Such arrangements have been called “bulletproof hosting”.

As of this evening the Wikileaks website isn’t even online, perhaps because it has been deluged with hits from people dismayed at the thought of it disappearing. I for one will be donating to Wikileaks, if I’m given the opportunity. The site has provided me with hours of fascinating reading – not so much the high-profile documents like the leaked Sarah Palin emails or the Guantanamo operating manual, but the confidential documents that show the inner workings of government and big corporates, revealing the language they speak when sensitive matters are being discussed.

Here’s hoping Wikileaks gets an injection of funds and survives – we need it more than ever.

Final chapter in the MMR-autism scandal Peter Griffin Feb 01


In the world of medical research, Dr Andrew Wakefield is about as controversial a character as you can find.

Andrew Wakefield

It was Wakefield who was behind claims published in The Lancet in 1998 suggesting a link between the measles, mumps and rubella (MMR) vaccine and autism in children. His paper, and subsequent statements to the media warning parents to be wary of the vaccine, led to a slump in vaccination levels in the United Kingdom and around the world, creating global distrust of vaccination in general.

In that respect, Wakefield has done untold damage, because his MMR-autism claims have been thoroughly discredited in numerous studies. Last week, the final nail was driven into the coffin that is Andrew Wakefield’s medical career, with the UK’s General Medical Council finding Wakefield and two colleagues guilty of a range of serious breaches in a fitness to practise case. You can read the 142-page ruling here, while scientists asked to comment on the ruling by the Science Media Centre in London give their take here.

Wakefield gave an interview to the Telegraph following the ruling.

As Brian Deer, the investigative reporter who has over the last decade untangled the real story around Andrew Wakefield, writes in the Times:

The panel’s findings were astounding, both in their number and substance. More than 30 charges were found proven against Wakefield. For him alone they ran across 52 pages. Embracing four counts of dishonesty – including money, research and public statements – they painted a picture of a man not to be trusted. Uptake of the MMR vaccine dropped to under 80 per cent nationally and in some places fewer than 50 per cent of children had the recommended two doses. Cases of measles rose and Britain saw its first death from the disease for 14 years. Mumps reached epidemic levels in Britain in 2005.

The ruling against Wakefield is professionally devastating for him. As Richard Smith, former editor of the British Medical Journal told Deer:

’Any journal to which a researcher shown to be dishonest submitted a paper would reject it. They would say, ‘This man can’t be trusted’. His career as a researcher is effectively over.’

Deer’s investigations laid out on his website are well worth a read if you are interested in the ethical and professional breaches Wakefield has been judged on by the council. As for the doctor himself, he remains defiant as this rare NBC interview from late last year illustrates.

What is the legacy of the MMR-autism scandal for New Zealand? Well, New Zealand has low rates of immunisation relative to the rest of the Western world. Last year, measles cases were reported at ten times the usual number in New Zealand, because parents opted out of vaccination programmes. Part of that is down to difficulty in accessing immunisation in parts of the community, but a portion of it is down to the lingering paranoia around the MMR vaccine that Wakefield’s research caused.

At the Science Media Centre last year, as measles cases were being reported all over the country, we asked Dr Samuel Katz, the co-inventor of the measles vaccine, about the supposed link between MMR and autism. He told us:

’Measles vaccine is both safe and effective. Repeated studies in Europe and North America have totally disproved the claim that MMR led to autism and bowel disturbances. As with any biological item, there are unusual adverse reactions (allergic, hematologic, neurologic) but these are extraordinarily rare.’

The GMC investigation didn’t look at the scientific substance of Wakefield’s claims about autism – those have been pulled apart in peer-reviewed research, as Dr Katz explains above. What the council nailed Wakefield on was his shoddy and dishonest approach to research, and as such he is no scapegoat.

But as Ben Goldacre points out in this Guardian column, Wakefield’s research would never have created the firestorm of media coverage it did if the media had looked at the methodology of the research:

The MMR scare has now petered out. It would be nice if we could say this was because the media had learned their lessons and recognised the importance of scientific evidence, rather than one bloke’s hunch.

Instead it has terminated because of the unethical behaviour of one man, Andrew Wakefield, which undermined the emotional narrative of their story. The media have developed no insight into their own role — and for this reason there will be another MMR.
