SciBlogs

Archive 2010

#SciFoo: That was worth the trip! Fabiana Kubke Aug 11


I was lucky enough to be invited to SciFoo this year, which proved to be a wonderful experience. SciFoo is an unconference organised by O’Reilly media, Nature and Google. It brings together a group of sciencey people to talk about science, and I cannot describe the level of awesome that I experienced while I was there.

I went well-prepared: I had read the blogs of the attendees who blog, read their descriptions of themselves, contributed to the suggested sessions in the wiki, and showed up with a list of ‘must-meet’ people and ‘must attend’ sessions to make sure I made the most of it.

But (and I learned that this happens after having attended two KiwiFoos), I might as well not have done any of that homework. Because, apart from a couple of exceptions, I never got a chance to talk to the people on my list. Nor did I end up going to any of the sessions I thought I would go to. Instead, I found myself being pulled to 'other' people and 'other' sessions. And I guess that is the beauty of it all: meeting people and hearing interesting things that were not necessarily on my radar.

I started by attending two Lightning Talk sessions, moderated by Nat Torkington. Lightning talks are five-minute presentations, which were great because they gave me the chance to hear about lots of different stuff from very different people (which also explains why my original list ended up being useless). I was drawn to the third lightning talks session the next day. There I heard about the relationship between scientists and music from Eva Amsen, what we can learn about people by asking them how they played as children from Linda Stone, neuroscience and law from David Eagleman, and many other mind-tickling topics.

These are some of the other sessions I attended:

RuleCamp: Basically about rules to follow to do stuff. Carl Zimmer, one of the speakers, summarised the session on his blog, so I will send you there to read his notes (which are much better than mine!).

Brain Machine Interfaces: I seem to have a fetish for BMIs, and the work of Miguel Nicolelis in this area changed the way that I think about the brain. So I couldn't miss this one (especially since Nicolelis was there too!). I will be writing a bit more about this at a later time, but it is totally worth reading about his research on his page. Most of all, I was seriously impressed not only with how far BMIs have come, but with how this kind of research is making us think about the brain in a very different way.

Collaborative Science: This was fun, and I mean that in a literal way. Because among other things discussed, FoldIt came up. Yes, you can contribute to science by playing games. And in the process you end up being acknowledged as an author on a Nature paper.

Blogging in a network: This session was led by John Dupuis, Carl Zimmer, Eva Amsen and Jonah Lehrer. Eva has a wonderful summary of this on her blog, so again, I will send you there.

I went to many other interesting sessions and had amazing scattered chats with different people throughout SciFoo. It was great to see old friends and acquaintances, and make new connections. But one thing I learned at Kiwi Foo is that as amazing as the few days of the event are, what is even more amazing is what happens 'between' Foos. There is a whole year ahead, and I can't wait to see what comes out of it.

(I have to give a special thanks to Nat Torkington and Cat Allman, who I am sure had a hand in getting me there, and also to Eva Amsen for wonderful personal swag from The Node.)

[Open] Science Sunday — 18.07.10 Fabiana Kubke Jul 19


Thursday saw a meeting on 'Data Matters: Making the Most of Publicly-Funded Research Data', organised by the Ministry of Research, Science and Technology. The event was tweeted under the #ResearchDataMatters hashtag on Twitter, and I wrote my notes on my FriendFeed page.

(Image by Illustir on Flickr)

The day consisted of a number of topical talks (great, all of them) and a couple of brainstorming sessions by the individual tables. Julian Carver, who moderated the event, did a wonderful job keeping us busy while sticking to the time schedule. It was indeed a great day filled with new ideas and, more importantly, new solutions.

It was clear that the room was filled with the vibe that opening research data was not only important but also the direction in which we should move. The arguments in favour of this are quite compelling, and New Zealand can look inwards and abroad to find support for that position. There were also great examples of what New Zealand is already doing in that respect, and that is also encouraging.

The central theme that emerged from the day was that the question about sharing data has moved from the if to the how domain. And the how is not an easy issue to solve, and one that occupies the time and thoughts of many advocates of open data. I think that these issues can be grouped into three broad categories: ethical, cultural, and those related to archiving.

Ethical issues:

In a way these are the ones that are relatively simpler to solve, and they probably encompass a narrow area of research primarily associated with health (or other human) data. One of the concerns that was raised was that ethical approval and consent around the gathering of health data are bound to specific studies that limit the 'use' of the data. A second concern is associated with privacy. I see these as relatively minor, since there are protocols in place for privacy, and 'use' can be redefined in the consent forms.

Cultural issues:

Cultural issues in the scientific community are a slightly higher hurdle to overcome, because overcoming them requires two things: a 'buy in' from the research community and an (I think) rather profound behavioural change that makes data archiving the default. There are heaps of issues around this, and I will probably leave it at that and come back to it in another post.

Archiving issues:

There was a general consensus that data should be shared. As Penny Carnaby said, if we invest in something because we think it is important, then we should also be thinking about how to preserve that knowledge. Or, what is the point of creating stuff if you then go ahead and delete it?

I also had the feeling that there was a general consensus that, ultimately, it is not just about putting data on the web. Data is only useful if it can be discovered, and only as useful as it is easy to re-purpose. But making data available in a meaningful, reusable way is hard to do. Here is where my brain explodes, and where most of the talking on the day centred.

There were a few things, however, that stuck with me and kept floating in my head as I took my flight back to Auckland.

One was a suggestion brought up by Andrew Treloar from the Australian National Data Service, about the need to make the data a 'primary object'. We researchers tend to think of the 'paper' as the end product, but he suggests this hierarchy should also apply to the data. He suggests that data sets should be given a DOI, in the same way that manuscripts are, and this has several advantages. Not only does the data itself become a primary object, but the mechanisms for linking DOIs are already in place to create relationships and track citations between objects. DOIs have a further advantage: attribution to the original source becomes inevitable. This idea solves, at least in the interim, some complex issues around data sharing.
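To make this concrete, here is a minimal sketch, in Python, of what treating a data set as a primary object might look like: both the paper and the data set get their own persistent identifier, and the citation between them is just another machine-readable link. The DOIs, titles and relation names below are entirely made up for illustration; this is not how any particular registry implements it.

```python
from dataclasses import dataclass

@dataclass
class DigitalObject:
    doi: str          # persistent identifier (hypothetical examples below)
    title: str
    creators: list

@dataclass
class Relation:
    source_doi: str   # the citing object (e.g. a paper)
    relation: str     # e.g. "cites"
    target_doi: str   # the cited object (e.g. a data set)

# Hypothetical paper and data set, each a first-class object with its own DOI.
dataset = DigitalObject("10.9999/example.dataset.v1",
                        "2009 field recordings (raw data)", ["Researcher, A."])
paper = DigitalObject("10.9999/example.paper",
                      "Analysis of the 2009 field recordings", ["Researcher, A."])
links = [Relation(paper.doi, "cites", dataset.doi)]

# With both DOIs registered, citation tracking works in either direction:
# which papers cite the data set, and which data a given paper relied on.
for link in links:
    print(f"{link.source_doi} --{link.relation}--> {link.target_doi}")
```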

A second point, also brought up by Andrew Treloar, is that open data will probably be used to answer questions that are different from those for which the data was generated. This means that we researchers need to think of the description of the data beyond its original intention, to facilitate re-purposing. And this is difficult, because how can I know what details will be needed when the question has yet to be posed? The minimum requirement would be to ensure that the data is properly described, at least in terms of its origins and the steps through which it was obtained.
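As a rough illustration of that minimum requirement, here is what a small, hypothetical provenance record for a data set might contain. Every field name and value below is invented; a real repository would have its own schema.

```python
# Hypothetical minimal description of where a data set came from and how it
# was processed, so that someone re-using it later can judge what it can
# (and cannot) be re-purposed for.
dataset_description = {
    "doi": "10.9999/example.dataset.v1",          # invented identifier
    "collected_by": "Example Lab, University of Example",
    "collection_period": "2009-01 to 2009-12",
    "instrument": "temperature logger, model X",   # placeholder
    "units": "degrees Celsius",
    "original_purpose": "seasonal temperature monitoring",
    "processing_steps": [
        "raw logger output exported as CSV",
        "readings flagged as faulty by the logger removed",
        "remaining values averaged into daily means",
    ],
}
```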

One of the things I also really took back with me was Penny Carnaby’s description of the work that the National Library of New Zealand has been doing around archiving of digital objects. She described the work done for the National Digital Heritage Archive (you can read about it here and here). The way I understand it, this system could provide viable solutions to some of the issues surrounding data archiving.

There is obviously a lot of work to be done, but it was encouraging to be in a room filled with people willing to be honest about the challenges, yet still enthusiastic about the road ahead. I will be interested in hearing what the follow-ups of Thursday’s meeting are, in particular the position of the funding bodies that were present in the room.

Megathanks to Jonathan Hunt and Julian Carver, who made it possible for me to be there.

For fireflies, getting the girl requires team work Fabiana Kubke Jul 10


Imagine you drive into a motel in Gatlinburg, TN, and see, behind an open room door, two guys setting up cameras pointing at the beds while two young women peek from the parking lot. Well, if it was in the mid '90s it might have been Drs Moiseff and Copeland setting up the equipment before venturing into Elkmont in the Smoky Mountains to study the local fireflies. (And one of the two women would have been me.)

Andy Moiseff and Jon Copeland started studying the population of fireflies in the Smoky Mountains National Park after learning from Lynn Faust, who had grown up in the area, that they produced their flashes in a synchronous pattern.

Image courtesy of Andy Moiseff

In the species they are studying (Photinus carolinus) the males produce a series of bursts of rhythmic flashes that are followed by a 'quiet period'. But what is particularly interesting about this species is that nearby males do this in synchrony with each other. If you stand in the dark forest, what you see is groups of lightning bugs beating their lights together, pumping light into the forest in one of nature's most beautiful displays.

Females flash in a slightly different manner and, as far as I know, they don't do it synchronously, either with other females or with the males. One interesting thing in Elkmont is that there are several species of fireflies, and you can pretty much tell them apart by their flashing patterns. But as useful as this is for us biologists (since it avoids having to go through extensive testing for species determination), the question still remained of whether the flashing patterns played a biological role.

And this is what Moiseff and Copeland addressed in their latest study published in Science. They put females in a room where LEDs controlled by a computer simulated individual male fireflies. The LEDs were made to flash with different degrees of synchronisation, and they looked at the responses of the females. They found that while the females responded to synchronous flashes of the LEDs, they really didn't seem to respond when the flashes were not synchronous. Even more, they responded better to many LEDs than to a single one. What this means is that if you are a male of Photinus carolinus, you had better play nice with your mates if you want to get the girl.

What *I* want to know is how this behaviour is wired in the brain. At first glance, this seems like a rather complex behaviour, but in essence all it seems to require is a series of if/then computations, which should not be too hard to build (at least not from an 'electronic circuit' point of view). But Bjoern Brembs reminded me of a basic concept in neuroscience: brains are evolved circuits, not engineered circuits. So, Andy and Jon, how *do* they do it?
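Just to illustrate the 'series of if/then computations' idea, here is a toy sketch in Python. It is not Moiseff and Copeland's model, and the window and threshold values are invented; it only shows that a simple rule of the form 'respond if enough flashes arrive close together in time' already distinguishes synchronous from asynchronous males.

```python
import random

def female_responds(flash_times, window=0.5, min_flashes=3):
    """Toy rule: respond only if at least `min_flashes` flashes fall within
    `window` seconds of one another. Both numbers are made up."""
    flash_times = sorted(flash_times)
    for t in flash_times:
        close = [s for s in flash_times if abs(s - t) <= window]
        if len(close) >= min_flashes:
            return True
    return False

# Six 'males' flashing almost together (synchronous)...
synchronous = [10.0 + random.uniform(-0.1, 0.1) for _ in range(6)]
# ...versus six flashing at scattered times (asynchronous).
asynchronous = [10.0 + random.uniform(-3.0, 3.0) for _ in range(6)]

print(female_responds(synchronous))    # True
print(female_responds(asynchronous))   # usually False
```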

Original article: Moiseff, A., & Copeland, J. (2010). Firefly Synchrony: A Behavioral Strategy to Minimize Visual Clutter. Science, 329(5988), 181. DOI: 10.1126/science.1190421

[Open] Science Sunday 27.06.10 Fabiana Kubke Jun 27


It’s been a while since I wrote one of these posts, and a lot has been happening which is really great. But, as usual, exam marking took priority, or should I say, took over my life.

And the SPARC goes to….

First of all, I would love to congratulate the authors of the Panton Principles for receiving the SPARC innovation award. What is SPARC? According to their website:

’SPARC has become a catalyst for change. Its pragmatic focus is to stimulate the emergence of new scholarly communication models that expand the dissemination of scholarly research and reduce financial pressures on libraries.’

According to the press release of June 22nd, the authors of the Panton Principles (Peter Murray-Rust, Cameron Neylon, Rufus Pollock and John Wilbanks) were given this award because:

’The authors advocate making data freely available on the Internet for anyone to download, copy, analyze, reprocess, pass to software or use for any purpose without financial, legal or technical barriers. Through the Principles, the group aimed to develop clear language that explicitly defines how a scientist’s rights to his own data could be structured so others can freely reuse or build on it.’

There is a great article on the award and the history of the Panton Principles here. It is definitely worth a read.

(HT Jonathan Gray)

Open Science Summit

The 'First Ever Open Science Summit' will be taking place in Berkeley, California, on July 29-31. It promises to be a great event, not only because I am sure it will bring a lot of energy from the people attending a 'First', but also because the session schedule just made me drool. From a retrospective of the human genome, to Open Access publishing, to Citizen Science, this looks like it will be a couple of days to remember for those able to attend.

(HT @JasonHoyt on Twitter)

Licensing Open Data

The Panton Principles address some of the issues surrounding how data should be shared. Last week Glyn Moody on Twitter pointed to this site: the Open Data Commons, a project run by the Open Knowledge Foundation. The site provides three types of licences for data. I found the FAQ section quite informative, especially the linked section that discusses why these licences were put into place as opposed to the Creative Commons licences. (HT @glynmoody on Twitter)

Related to this, there is a really interesting article on the Open Knowledge Foundation site that discusses the differences between non-commercial (NC) and share-alike (SA) licences, and which addresses why the licences offered by the Open Data Commons are the way they are.

’This interoperability is absolutely key to realizing the main practical benefits of ’openness’ which is the ease of use and reuse – which, in turn, mean more and better stuff getting created and used.

[...]The aim is to ensure that any license which complies with the definition will be interoperable with any other such license meaning that data or content under the one license can be combined with data or content under the other license.

[...]Non-commercial provisions are not permitted because they fundamentally break the commons, not only through being incompatible with other licenses but because they overtly discriminate against particular types of users.’

The Panton Principles had already suggested that licences other than the Creative Commons CCZero should be discouraged. The Open Data Commons provides licensing formats for data and databases that should facilitate the way that data can be shared.

And that is a good thing.

How to build a [water] brain Fabiana Kubke Jun 18


Whenever I try to teach some aspects of neuronal integration in class, I run into trouble, since most neuronal properties are defined by mathematical formulae describing the electrical properties of neurons, which are sometimes difficult for the students to grasp. Without a basic knowledge of electricity, it is hard to build a conceptual image of what neurons are doing.

Or is it?

I was invited to talk about the brain to a group of 9-11 year old pupils at a primary school on the North Shore yesterday, and I thought it might be fun to try to build neurons and discover how they worked. So, here is my water neuron:

The water neuron

It turns out, this little water neuron (which can be built with pretty much household items) has a lot to show about the passive properties of neurons.

Synaptic currents:

The pipette dropper was used to inject [current] water into the different dendrites. Because of the properties of the dropper, there is a limit to the amount of current that can be injected at a given time, and the injection of current is not instantaneous but has a time course that is analogous to the time course of the synaptic potential.

Spatial and temporal integration:

Current can be injected into one or more dendrites with different time patterns. Injecting into all dendrites at the same time, or into one or more dendrites at different time intervals, provides a good idea of how the output of the neuron is shaped by spatial and temporal integration.

Threshold:

By tilting the 'soma' to different degrees, the amount of current that needs to be injected into the dendrites to produce an output through the axon can be increased. Therefore, one can build neurons with different thresholds and see how that affects the output of the neuron.

Leak currents:

One can poke tiny holes into the soma so that some of the current injected into the neuron leaks out. Combining this with changes in threshold and in the temporal patterns of injection into the dendrites is a good way of showing how temporal and spatial integration work in different ways to produce an output through the axon. One can also put some leaks into the axon, and 'myelinate' it with saran wrap to show the insulating properties of the myelin sheath.

Although this 'water neuron model' cannot illustrate the active properties of neurons, it does contribute to an intuitive construct of how currents may be acting in individual neurons. The different neurons can be connected to form a circuit, and then one could examine how the output of the circuit is affected by changing things like threshold, leak and number of inputs into individual neurons.
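For anyone who wants the electrical counterpart of the water model, the closest standard abstraction is a leaky integrate-and-fire neuron. Below is a small sketch in Python with made-up parameter values; it shows the same ingredients as the water neuron (two 'droppers' of synaptic input, a leak pulling the voltage back to rest, and a threshold that triggers an output). In this particular example neither input alone crosses threshold, but the two together do, which is the summation the water model demonstrates.

```python
import numpy as np

dt = 0.1            # time step (ms)
tau = 10.0          # membrane time constant (ms): how quickly the 'leak' drains
v_rest = -70.0      # resting potential (mV)
v_thresh = -55.0    # threshold (mV): the 'tilt of the soma'
v_reset = -70.0     # potential after a spike (mV)

t = np.arange(0, 200, dt)
# Two synaptic inputs ('droppers'), overlapping between 40 and 60 ms.
i_syn = np.zeros_like(t)
i_syn[(t > 20) & (t < 60)] += 1.2     # dendrite 1 alone: settles near -58 mV
i_syn[(t > 40) & (t < 100)] += 1.2    # both together: heads for ~ -46 mV, above threshold

v = np.full_like(t, v_rest)
spikes = []
for k in range(1, len(t)):
    # leak pulls v back towards rest; the input pushes it up (temporal integration)
    dv = (v_rest - v[k - 1]) / tau + i_syn[k]
    v[k] = v[k - 1] + dv * dt
    if v[k] >= v_thresh:              # threshold crossed: output through the 'axon'
        spikes.append(t[k])
        v[k] = v_reset

print(f"{len(spikes)} spike(s)" + (f", first at {spikes[0]:.1f} ms" if spikes else ""))
```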

Well, it was fun. I may give this a go in my next neuro class at Uni.

About Paulo Freire Fabiana Kubke Jun 08


[Cross posted from Talking Teaching]

As part of the Postgraduate Certificate where I am a student, I was to give a 10 minute lecture on one theory of teaching. A list of 'candidate' theories was provided, and to my surprise Paulo Freire's 'Pedagogy of the Oppressed' was on the list.

Well, that was quite a surprise.

I had first come across Paulo Freire's original book over twenty years ago, when I read it in the context of literacy programmes in Latin America. I would not, then or now, have predicted that his ideas would ever make it onto a rather mainstream reading list. So, of course, I thought it would be fun to read him once again.

I don't think I was aware how much I had internalised Freire, and how much of the way that I think about teaching is inspired by that original reading. It was indeed an interesting exercise, especially because this time around I read his book while thinking how (or if) his ideas could be put in place in tertiary education given the real-life limitations of the current tertiary system (like the large size of the classes).

In any case, this lecture also gave me the opportunity to give Prezi a go. First time user, but I love what can be done with it.

Freire's philosophy is perhaps better defined by what it is not (it is not what he calls banking education). What it is, to me, is what is in this presentation. The presentation also has some thoughts about how I think his ideas could be applied to the current educational system.

It may make for a nice debate, so I thought why let all the work go to waste, right?

Well, here it is: http://prezi.com/3wsh5y4vtl4c/

[Open] Science Sunday — 18.04.10 Fabiana Kubke Apr 18


Opening content by traditional Toll Access journals

At about the time that I had made the personal commitment to only contribute to Open Access publishing (as service on editorial boards or peer review), I was contacted by Georg Striedter asking me to join the Editorial Board of Brain Behavior and Evolution. After hearing his views about the new direction he was planning to take the journal, I could not refuse. Georg Striedter took up the editorship of the journal starting this year and a big change ensued. The journal now has a new section called Highlights and Perspectives in Neuroscience, and articles in this section have been made free to access. I have been quite impressed with this section, and with the quality of the discussions there. As an example, you can go and look at Mark Changizi's piece 'Neuroscientist's Embarrassment: Artificial Intelligence's Opportunity' and Anat Barnea's piece 'Wild Neurogenesis'. For the latter, I recommend first reading a great summary of the article around which the discussion centres, posted by NeuroDojo when the article came out. Mark Changizi's piece is self-contained, but you might also want to check this awesome discussion around brain size.

Many journals are opening up some of their content for free, and this is a good move. For example, on the 26th of March, Nature Publishing broke the news that it was making the Nature News content free of charge. This is a great section and it is wonderful to have that content available to the general public.

Don't mess with technology: what about BioTorrents?

On the 10th of April I participated in the Public Acta meeting in Wellington. One of the statements of the Wellington Declaration said that

’[Technological Protection Measures] should not infringe on or limit the rights of users to use or access copyright material in a manner that would be permitted without the TPM’

One of the arguments raised that day is that technology is sometimes attacked when it can be used to infringe on copyrighted works, but that restricting such technologies may infringe on the ability to access material that is otherwise legally available.

This week saw the publication of a paper in PLoS One by Morgan GI Langille and Jonathan A Eisen: 'BioTorrents: A file sharing service for Scientific Data'. As Tim O'Reilly said in his tweet, this is a great use of the BitTorrent technology. Here is a technology that has valuable applications and should, too, be protected as such.

One may ask what prompted me to attend the PublicACTA meeting. The answer is simple: most scientific information is behind the copyright that, as authors, we often transfer to the journals where our work is published (for journals outside the Open Access model). Education, health and science rely heavily on having good access to this information. Any decision to regulate copyright will inevitably have an impact on education, health, science and technology. So ACTA cannot be framed around the protection of recording artists and the film industry without considering its implications for these other areas of public good. The text of ACTA will be made public next week, at which time I hope scientists, educators and health professionals will collaborate in making sure the implications for their fields are taken into account.

Get your ACT(A) together Fabiana Kubke Apr 12


The Anti-Counterfeiting Trade Agreement (ACTA) is an international treaty that is being negotiated by a handful of nations in a very secretive way. The text of ACTA has not been officially released, but some of the issues under negotiation can be found in several leaked documents. However, public officials are unable to comment on such leaked documents.

Although its name may indicate that it is meant to deal with issues around counterfeiting, leaked documents have shown that the scope of the agreement may go well beyond that, to include other types of copyright and intellectual property infringements. And this raises two important questions:

  1. Why is an anti-counterfeiting trade agreement being negotiated by only a handful of countries to deal with issues that should fall under the umbrella of the World Intellectual Property Organization (WIPO), which has a greater representation of nations?
  2. Why the secrecy?

The next round of negotiations of ACTA is to be held in Wellington, New Zealand, this week. In reaction, a conference known as PublicACTA was held on Saturday to draft a response to be presented to the ACTA negotiators in Wellington. The document, now known as The Wellington Declaration, can be found online and is linked to the petition signature page.

The document calls for transparency in the negotiation process and an opportunity for public participation, as well as a definition of the specific issues that are or should be associated with the ACTA itself.

Anyone who uses the internet will be either directly or indirectly affected by the ACTA resolutions. It is shocking to me that, while on the one hand there is an increasing awareness of the need to open government and scientific data to public scrutiny, an agreement that will impact the most ubiquitous medium for data sharing, open collaboration and discussion is negotiated behind closed doors and without significant public participation. If this is not a contradiction in terms, then I need to buy another dictionary.

  • Michael Geist’s blog is a great source of information.
  • The video of the Conference is now here.
  • The Wellington Declaration is here.
  • The petition signature page is here.

Thoughts on Hunter’s statement on Science, Climate Change and Integrity Fabiana Kubke Apr 09


Professor Keith A Hunter has published a statement on 'Science, Climate Change and Integrity' on the Royal Society of New Zealand website. His position with respect to the controversies surrounding climate change is made clear, as is his call for a re-examination of attitudes on both sides of the argument.

The controversies surrounding the science of climate change underscore the need for a more open approach in the reporting of scientific data, and Professor Hunter's statement makes a strong argument for moving in that direction. A while back, Cameron Neylon [1] wrote on his blog, in reference to the CRU email leaks, that

[...] scandal has exposed the shambolic way that we deal with collecting, archiving, and making available both data and analysis in science, as well as the endemic issues around the hoarding of data by those who have collected it.

There are many arguments in favour of making scientific data openly available. In a recent commentary in Science, Jean-Claude Bradley is quoted as saying:

“It’s sort of going away from a culture of trust to one of proof,” Bradley says. “Everybody makes mistakes. And if you don’t expose your raw data, nobody will find your mistakes.”

Along those same lines Hunter argues that

’Science is a rational endeavour that is based on logical and critical analysis of scientific theories in the light of actual evidence. It follows that scientific information, including a transparent description of how the data has been processed and tested against hypotheses, must be publically available, especially when it has been publicly funded [...]’

Without this open approach, the validity of scientific information has to be entrusted to the peer review system, but even Professor Hunter echoes the concerns of the scientific community at large, that

“while we place great faith in the peer review process to weed out ideas that are wrong, peer review is not perfect and can be abused by both sides.”

And he further argues that:

’If the intensity of the personal attacks on climate scientists over recent months are to have any positive effect, it will be the adoption of a more transparent approach to the dissemination of information.’

Two sides of the coin: Public/Open access vs Open Data:

Although the issues surrounding open data and open access can be seen to sit under the same umbrella, they really deal with two slightly different aspects of the dissemination of information. While public or open access to published research is now a requirement of many public funding agencies, and is important for the public dissemination of information, it does not in itself solve issues such as those raised around climategate.

I am a strong supporter of Open Access publishing, whereby the public has access to the published information, but unfortunately it shares some of the shortcomings of toll access publishing when it comes to the review process: the process is not fail-proof. The Public Library of Science has to be commended for opening up the post-publication discussion of the work it publishes and making it possible to highlight both shortcomings and strengths in the published material, in this way allowing errors associated with the peer-review process to be corrected. Yet even within this open model the criticisms are raised about the published material itself, and this solution falls short of solving the kinds of issues raised by the critics of climate change science: the raw data is not published by default (although it can be requested, and it is PLoS policy that it should be made available by the authors).

Open Data, on the other hand, makes the raw data available: the analysis can be checked, rechecked, rehashed and reanalysed by other people. And as Jonathan Eisen is quoted as saying in the Science article, people do find mistakes.

Opening the data allows those mistakes to be found and corrected, and that can only be good, since the ultimate goal of science is not to defend one's pet theory but to keep one's mind open to find the answer that is most consistent with the data. As Lawrence Krauss said in his lecture:

’I would argue that the definition of open-mindedness is forcing our beliefs to conform to reality, and not the other way around.’

Opening the data will inevitably lead to agreed-upon interpretations that conform with reality: it allows the conversation to centre around scientific facts rather than around personal attacks on the scientific community or on the groups of vocal skeptics. And, ultimately, finding the best answer is the one common ground that is shared by both groups.

So what next?

Hunter states that:

[...] it is only fair to expect the critics of the mainstream scientific views [...] to adopt an equally transparent approach with their own information, and with their own use and re-analysis of data entrusted to the public domain. They should also subject their findings to rigorous peer review. Opinion, however forthrightly expressed, will not change the laws of basic science.

As far as I know, there is still an Open Access mandate to be had in New Zealand's public research funding agencies, let alone one on Open Data. So it was not without surprise that I read Hunter's statement that

’In this regard, the Royal Society of New Zealand intends to play its part by developing a Code of Practice for Public Dissemination of Information that it hopes will assist the various New Zealand science organisations in improving their practices.’

This approach is needed if common ground is to be found between scientists, and between scientists, society and policy makers. Disagreement is perhaps the strongest force that moves the interpretation of scientific data within the bounds of the most likely explanation. But disagreement can only move in a positive direction when all parties involved have equal access to information. In science, that is called the data.

[1] Cameron Neylon’s blog has now moved here.

[2] Disclaimer: I receive and have received research funding from the Royal Society of New Zealand and am an Academic Editor of PLoS One.

On the perils of becoming a dinosaur Fabiana Kubke Apr 05

[Crossposted from Talking Teaching]

As a student I always complained about the 'dinosaur teachers': those who had lost touch with the students and with the teaching material; those whose attitude seemed to scream: 'I cannot be bothered any more'.

Patricia Cranton says, in the context of why someone teaches:

’Another person may have defined himself as a teacher through having a vision of the role of teaching in society but may now, after many disillusioning years of practice, maintain his perspective of himself as a teacher because it is a social expectation or obligation from which he feels he cannot escape. ’

And that seems to sum up what a dinosaur teacher is. Teaching is neither foreign nor new to me; I have been teaching one way or another since 1982, and most of the women in my family were teachers of one sort or another. Yet I am not a teacher. I never received any formal training in teaching, and whatever I learned to do or to avoid, I did through trial and error. I am a scientist. I know how to do science. I received formal training, and though I (somehow) know how to navigate that world, it does not instantly qualify me as a good science teacher.

So after all these years, it was time for me to ask: have I become a dinosaur teacher? And if I have, can I do something about it?

I am now facing the challenge of replacing Colin Quilter in his teaching at the Medical School. These are not small shoes to fill, and it is a huge challenge. First, I am going back to teaching first and second year, which I have not done in a long time and which I consider much more difficult than teaching higher-level courses. It is not only the language but the size of the class. How do you engage with over a thousand students, especially when some are in an overflow room that I cannot see? Colin was, to say the least, beloved by his students. If you do not believe me you can become a fan of a page called 'Shrine to Colin Quilter' on Facebook, or read his feature profile at the Ako Aotearoa Academy of Tertiary Teaching Excellence. And facing a class knowing that the students expect a 'Colin' experience can be nothing less than terrifying.

But since I face fear as a scientist, I have decided to take a degree in education. For the next two years, I will become a student in tertiary education: I will sit in class, I will do homework and assignments, and I will be assessed, all while I try to learn how to become a better teacher. I am not sure what to expect from the programme, but one thing is for certain: I will be in my students' shoes again, shoes I vacated many years ago. And the programme, one way or another, will make me sit down and think about issues around teaching in a more formal way. And that cannot be a bad thing.

Patricia Cranton (2001) Becoming an Authentic Teacher in Higher Education. Krieger Publishing Company, Malabar, Florida.
