SciBlogs

Archive March 2012

‘scientists anonymous (nz)’ write again… Alison Campbell Mar 26

2 Comments

I’ve written about the group who call themselves ‘Scientists Anonymous (NZ)’ before, in the context of determining the reliability of sources. At the time, I commented that I would have a little more confidence about the information this group was putting out there if the people involved were actually identified – as it is, they are simply asking us to accept an argument from (anonymous) authority. (I was rather surprised to actually receive a response to that post, albeit one whose authors remained anonymous.) Anyway, this popped up in my inbox the other day, and was subsequently sent to me by several colleagues in secondary schools:

TO: Faculty Head of Science / Head of Biology Department

Please find a link to the critically acclaimed resource (http://programmingoflife.com/watch-the-video) dealing with the nature of science across disciplines/strands.

Interesting to see an attempt to link it into the current NZ Science curriculum with its focus on teaching the nature of science.

PROGRAMMING OF LIFE

  • The reality of computer hardware and software in life
  • The probabilities of a self-replicating cell and a properly folded protein
  • Low probability and operational impossibility
  • The need for choice contingency of functional information

Freely share this resource with the teaching staff in your faculty/department.

Yours sincerely

Scientists Anonymous (NZ)

So, I have been to the website. I intend to watch the video tonight (from a comfy chair), but the website itself raises enough concerns that I’ll look at some of them briefly here. And I’ll also comment – if they really are ‘doing science’, then it’s not going to be enough to simply produce a list of ‘examples’ of the supposed work of a design entity (because that’s what all the computing imagery is intended to convey) & say, see, evolution’s wrong. That would be an example of a false dichotomy, & not scientific at all. They also need to provide an explanation of how their version of reality might come to be.

Its blurb describes the video as follows:

Programming of Life is a 45-minute documentary created to engage our scientific community in order to encourage forward thinking. It looks into scientific theories “scientifically”. It examines the heavy weight [sic] theory of origins, the chemical and biological theory of evolution, and asks the extremely difficult questions in order to reveal undirected natural process for what it is – a hindrance to true science.

The words ‘undirected natural process’ immediately suggest that this is a resource intended to promote a creationist world-view. I would also ask: if the documentary is created to ‘engage our scientific community’, then why did Scientists Anonymous send it to secondary school teachers in biology and not to universities & CRIs across the country? The blurb goes on:

This video and the book it was inspired by (Programming of Life) is about science and it is our hope that it will be evaluated based on scientific principals [sic] and not philosophical beliefs.

Unfortunate, then, that they wear their own philosophical beliefs so clearly: ‘undirected natural process’ as a ‘hindrance to true science’.

As well as linking to the trailer for the video, & the full video itself, the Programming of Life website also presents a bunch of ‘tasters’. One of these is the now rather hoary example of the bacterial flagellum (irreducible complexity, anyone?). The website describes ‘the’** flagellum thusly:

The bacterial flagellum is a motor-propeller organelle, “a microscopic rotary engine that contains parts known from human technology such as a rotor, a stator, a propellor, a u-joint and an engine yet it functions at a level of complexity that dwarfs any motor that we could produce today. Some scientists view the bacterial flagellum as one of the best known examples of an irreducibly complex system. This is a single system composed of several well-matched, interacting parts manufactured from over 40 proteins that contribute to basic function, where the removal of any one of those parts causes the entire system to fail.

** As noted on my link for this example, there is no such thing as “the” bacterial flagellum as the sole means of bacterial locomotion: different prokaryotes get around in different ways. Nor is the flagellum a case of design; its evolutionary history has been quite well explained. The lack of quote closure (& of citation) is in the original.

Mitochondria have their own executable DNA programs built in to accomplish their tasks.

Well, yes, & no. Several key mitochondrial genes are actually found in the cell’s nucleus – something that allows the cell to control some aspects of mitochondrial functioning (& incidentally prevents the mitochondria from leaving!). There’s a good review article here. That the number of nuclear-based mitochondrial genes differs between taxa is a good argument for evolution; for design – not so much.

Much like the firewall software on your computer the membrane contains protein gate keepers allowing only those components into the cell that belong and rejects all other components. The membrane is thinner than a spider’s web and must function precisely or the cell will die.

Well, d’oh – except when it doesn’t. Viruses, and poisons that interrupt cellular metabolism, get in just fine. They really are pushing the boundary with this computer metaphor.

The human eye is presented as an amazingly complex ‘machine’ – yet we have a good explanation for how that complexity evolved. And more telling (but omitted from this presentation): the eye’s structure isn’t perfect – it’s a good demonstration of how evolution works with what’s available, but hardly an argument for the wonders of directed design. The same can be said for the human skeleton, which is also in the taster selection, along with the nucleus, DNA, & ribosomes (which come with more, lots more, of the computer software imagery).

As I said earlier, if this video is not simply another example of the use of false dichotomy to ‘disprove’ a point of view with which its authors disagree, it had better provide more than metaphor. That is, I’ll be looking for a strong, evidence-based, cohesive mechanism by which these various complex features sprang into being. Otherwise, we’re not really talking ‘nature of science’ at all.

_______________________________________________________________________________

I was going to stop there (for now) but then I noticed the ‘Investigate the facts’ heading. It links to a list of various papers & articles that supposedly support the ‘design’ hypothesis. Richard Dawkins’ name caught my eye – he’s there for writing that

Human DNA is like a computer program, but far, far more advanced than any software we’ve ever created.

I had a couple of thoughts; a) metaphor is a wonderful thing, & b) Dawkins is a biologist & science communicator, but not necessarily big on programming. (If I am inadvertently doing him a disservice, I apologise!). Someone else had the same thoughts.

tertiary teachers & accreditation Alison Campbell Mar 21

2 Comments

This is something I wrote for Talking Teaching. It doesn’t have a strong biology focus, so I hope my ‘regulars’ will forgive me :-), but I’d like to generate some discussion around this issue.

Over the years I’ve had a fair number of conversations with my students about what’s involved in being a university lecturer. They ask things like how I decide what to teach, how we develop programs, and – this year – just what I do when I’m not in front of a class. (They genuinely thought that I’m ‘on holiday’ when the teaching semester’s over: I found this rather sweet *smile*.)

And someone will always ask, do university lecturers have any training in how to teach? After all, these days primary, secondary & pre-school teachers are all required to have professional qualifications in education.

The answer is, it depends. (I’m going to talk about university lecturers here as that’s the area I’m familiar with.)

Back in the ‘old days’ (ie when I was a student, lol) you probably would have been scratching to find any university lecturer who had a teaching qualification alongside their discipline-based qualification. (Back then, Colleges of Education were generally not part of the university system here in NZ.) These days, universities have some form of professional qualification available for their staff to study for, but it’s purely a voluntary decision to take it up. It’s probably fair to say that a significant majority of university lecturers still do not have formal training in education.

The obvious question is, does it matter? After all, generations of lecturers have learned the necessary skills ‘on the job’, and generations of students have completed their degrees or diplomas & gone on to graduate.

Yes. Yes, it does matter. Let’s have a look at the meaning of the term ‘accreditation’ (Ingvarson et al., 2006):

‘Accreditation’, as used in this report, refers to an endorsement by an independent external agency that a professional preparation course is adequate for the purpose of a particular profession; that the course is able to produce graduates who meet standards for entry to the profession and are competent to begin practice.

…Accreditation is also an important mechanism for engaging members of a profession in decisions about standards expected of those entering their profession, as well as standards expected of preparation courses.

In the context of this post, ‘accreditation’ would refer to confirmation that someone had been through a program of study that adequately prepared them to teach a class. In a teaching context, that program would include exposure not just to good teaching practices, but also to the professional literature around teaching in a particular discipline. And this matters a lot, because as I’ve said elsewhere on Talking Teaching, there’s so much more to teaching than simply transmitting information – the method which very many lecturers would have picked up, because that’s how they themselves were taught. (Certainly that was my experience, back in the day, & it’s one that my friend & colleague Kevin Gould described to great effect in a recent presentation on good use of teaching technology.)

In other words, university teaching is a profession (after all, I’ll bet many of us put ‘lecturer’ on census forms & the like!), and there’s a good case to be made to support academics’ ongoing professional development in education and to recognise that through some form of accreditation. As Hicks, Smigiel, Wilson & Luzeckyi (2010) note, such professional development can

[promote] a set of shared expectations and understandings about the nature of university learning and teaching

which would help to promote consistency in approaches across the institution and the wider sector and, because staff would gain an enhanced understanding of just how students learn, better learning outcomes for students. Note that consistency =/= homogeneity! Rather, academics at the various institutions would have (Hicks et al, 2010)

some common understanding of core learning and teaching principles.

This sort of professional development, leading to accreditation, should probably be focused on new lecturers to begin with, as they’re arguably those who really, really need such support. After all, as Kevin pointed out in his talk, if you’re thrown in the deep end & simply emulate the practices of those who taught you, you’re likely to pick up some pretty bad habits along with the good, & over time these can become deeply entrenched. (Which does suggest that it would be good, at some point, to involve experienced lecturers in the conversation around best practice in teaching and learning as well.) And you could also ask, why should both new teachers and their students struggle while the teachers find their feet? That’s not good for anyone.

The other thing is, universities have changed from the days when I was a student, & they’ll continue to change. Along with technological advances (which as Kevin said, have been embraced in very many secondary schools, to the point where students view teaching technology as the norm & may well expect to see it used in similar ways in university classrooms) and increasing numbers of ever more diverse students attending university (with ever more diverse experiences and needs), there’s also

an expectation that universities should be more accountable to funding bodies and other stakeholders (students, parents, employers, etc.) (Hicks et al, 2010).

One way to respond to this is for institutions to be able to demonstrate that their staff have that “common understanding of core learning and teaching principles” and are able to apply these in their classrooms for the good of their students’ learning.

And what’s the best way to show this? Through some form of accreditation.

(Of course, for all this to happen we do need a definite change in the culture of universities. Staff are probably not that likely to want to participate in professional development if they perceive that teaching is accorded less value than research when it comes to promotion, or when they perceive that such programs aren’t valued by their colleagues – or when models for workload allocation don’t take into account staff involvement in these programs. But there’s nothing to be lost by talking about and working towards that ideal.)

M. Hicks, H. Smigiel, G. Wilson & A. Luzeckyi (2010) Preparing academics to teach in higher education. Australian Learning & Teaching Council. http://www.flindrs.edu.au/pathe/

L. Ingvarson, A. Elliott, E. Kleinhenz & P. McKenzie (2006) Teacher education accreditation: a review of national and international trends and practices. Teaching Australia. ISBN 0-9775252-6-0

parasite goes bananas before s*x Alison Campbell Mar 16

No Comments

That got your attention, didn’t it? It certainly got mine when I was scanning the Science alert news page a wee while ago. The parasite in question is Plasmodium, the single-celled organism that causes malaria. (I’ve written about Plasmodium before as it has a rather interesting evolutionary history.) And the research in question was published in the Journal of Cell Science – annoyingly, my institution’s subscription excludes the most recent six months’ worth of papers, so I could only read the Science alert release.

It’s an interesting story. Like the other members of its genus, Plasmodium falciparum (which causes the most severe, potentially – & frequently – lethal form of malaria) has a complex life cycle. A mosquito that bites an infected human host will probably pick up P.falciparum in the blood it ingests, & can then transmit the pathogen to the next person it bites. Once in a new host, the malaria parasite reproduces asexually & goes through a number of life-cycle stages as it infects first cells in the host’s liver & later the host’s red blood cells. As the red blood cells swell with growing numbers of the parasite, they also accumulate a range of waste products produced by Plasmodium. Eventually the cells rupture & release both Plasmodium cells (all ready to infect more red blood cells) & those cells’ wastes into the host’s bloodstream, & this is what causes the physical symptoms of malaria.

Eventually the parasite metamorphoses into its reproductive phase – a phase that has the banana shape mentioned above. Strange though it may sound, apparently the crescent-like shape of these sexually-ready parasite cells is essential for their survival. Once outside the red blood cells the parasites are potentially exposed to the host’s immune system & can be targeted for destruction, but the banana shape seems to allow at least some to escape & survive long enough to be sucked up by another mosquito. (The actual plasmodial hanky-panky occurs in the mosquito’s gut.)

The Melbourne University research that’s described by Science alert has found that when Plasmodium’s ready for s*x, a particular set of proteins forms a banana-shaped scaffold underneath its cell membrane. This is interesting of itself, as it’s always nice to understand the mechanism by which something happens. But it’s made the research team rather excited, because identifying the proteins involved raises the prospect of targeting them – using a drug or perhaps a vaccine – & disrupting formation of the banana-shaped scaffold.

Which would pretty much put a dampener on any further prospects of hanky-panky, disrupting the parasite’s life cycle & so preventing the transmission of malaria. Great stuff!

marathon man, part II (another replay) Alison Campbell Mar 12

6 Comments

Since I (re)posted the first part of this story last week, I figure I’d better complete the tale today :-) Hopefully things will settle down a bit at work now the semester’s under way, & I can get back into some ‘proper’ writing!

Possession of an Achilles tendon is only one of the things that sets humans up for endurance running. Bramble & Lieberman (2004) note that long-distance running requires a whole suite of adaptations for skeletal strength, stabilisation, thermoregulation, and energetics. I’ll summarise some of their comments here.

Skeletal strength: running places much greater stresses on the skeleton than walking does. One adaptation that reduces these stresses is an increase in the area of joint surfaces. Compared to both chimps and australopithecines, Homo skeletons have much larger joint areas in the hip joint, knee, pelvis, and lumbar vertebrae. This suggests that australopiths walked, but were not regular runners, unlike species of Homo.

Stabilisation: when someone is running, their posture is less stable than when they are walking – they tend to lean forward and the pumping action of their legs means that the torso sways from side to side. The upper body is stabilised by several features, including large spinal muscles that are anchored to the pelvis, and the very large gluteus maximus muscle. This muscle is actively involved in running, less so in walking. In addition, added stability comes from vigorously swinging the arms and thorax in opposition to the swing of legs and hips. This is made possible by a narrow, flexible waist – a feature that is fully developed in H. erectus but not the australopiths – & free-swinging shoulders.

Thermoregulation: any organism that is physically active in a hot environment will risk heat stress and must have ways of losing heat. Prolonged running places much greater demands on the body’s cooling systems than walking does.

In modern humans, sweat glands permit evaporative cooling, and the reduction in body hair means that heat is more efficiently removed through convection. Obviously these don’t fossilise & so we can’t be certain of when they evolved.  However, other features are also important in thermoregulation. One is the body’s surface area relative to body mass – it is greater in tall, thin bodies (eg erectus, but not Australopithecus). Another is mouth breathing. This allows higher rates of airflow with less muscular effort than nose breathing, and also functions in shedding body heat. Modern runners are mouth breathers; the great apes are not.

Energy costs & demands: as you’ll remember from the previous post, the Achilles tendon contributes to the energy efficiency of running by acting as a spring, reducing the metabolic demands on muscles. Skeletal evidence from australopith leg bones suggests that these hominins lacked an Achilles tendon and so are less likely to have been good distance runners. The arch of a human foot also acts as a spring, again contributing to energy efficiency.

Stride length is also important. Running humans increase speed by increasing the length of their stride, something which is related to the presence of ‘springs’ in the legs & also to having relatively long legs. Homo erectus definitely had long legs, relative to body mass – perhaps 50% longer than in Australopithecus afarensis. Longer legs do impose an energy cost, however, as they are swung to & fro. This can be reduced by having most of the leg’s mass closer to the hip than the ankle (eg through reduction in the size of the foot).

Overall, it’s reasonable to hypothesise that endurance running is a feature that evolved with the genus Homo, and probably in erectus. However, we need more information in order to test that hypothesis, eg foot remains from erectus & more post-cranial material from habilis.

Reference: D.M. Bramble & D.E. Lieberman (2004) Endurance running and the evolution of Homo. Nature 432: 345-352

marathon man (rpt) Alison Campbell Mar 08

No Comments

I’ve been blogging since August 2007. Which seems quite a long time, looking back on it :-) Anyway, because I’m kind of rushed at the moment – & on the theory that new(ish) readers might not have delved all that far into the back issues, I thought I’d repost a couple of pieces from way back then, just to keep you going.

I was looking through the SciTech Daily website (a good place to go for new reading in a whole range of science areas) when I saw the link to an article on the evolution of running in Homo. Followed it, read the article – & thought, this is really interesting.

The article describes research on the efficiency of walking and running in humans. It notes that the Achilles tendon linking calf muscles to the heel is essential for energy-efficient running. Chimps and gorillas don’t have this long tendon, and the research team comment that it would be very interesting to know at what point in our evolutionary history the Achilles tendon evolved:

But if, as seems likely, early humans lacked an Achilles tendon then whilst their ability to walk would be largely unaffected our work suggests running effectiveness would be greatly reduced, with top speeds halved and energy costs more than doubled.

Efficient running would have been essential to allow our ancestors to move from a largely herbivorous diet to the much more familiar hunting activities associated with later humans. What we need to discover now is when in our evolution did we develop an Achilles tendon as knowing this will help unravel the mystery of our origins.

No Achilles tendon in great apes? Another search took me to a blog entry by PZ [sadly the link in the original post no longer takes you to his "Marathon Man" post], which included this image of human & chimp leg anatomy (the original is from a paper in Nature). You’ll see that in humans, the Achilles tendon links the calf muscle to the tarsal bone in our heel. But in chimps, the muscle extends right down to the tarsal. Why is this difference significant? To find out, I read the Nature article (reference at the end of this post).

In humans, the Achilles tendon acts as a shock absorber and also stores energy. It stretches as your foot comes down (the braking phase), gaining elastic potential energy. Then, it releases that energy through recoil as you push off from the ground again. That is, the tendon is acting like a spring – one that can save up to 50% of the metabolic cost of running through reducing the work done by the muscles themselves. This makes Homo quite good at endurance running, something that would be essential for active hunting out on the savannah.
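The energy bookkeeping behind that ‘spring’ claim is easy to sketch. Below is a minimal Python illustration of elastic storage in a linear spring (E = ½kx²); the stiffness, stretch, and per-stride muscle-work figures are hypothetical numbers chosen purely for illustration, not measurements from Bramble & Lieberman (2004).

```python
# Hedged sketch: the Achilles tendon modelled as a simple linear spring.
# All numbers below are illustrative assumptions, not published data.

def elastic_energy(stiffness_n_per_m: float, stretch_m: float) -> float:
    """Elastic potential energy stored in a linear spring: E = 0.5 * k * x**2."""
    return 0.5 * stiffness_n_per_m * stretch_m ** 2

# Hypothetical values: ~180 kN/m tendon stiffness, ~1 cm stretch per foot-fall.
per_stride = elastic_energy(180_000, 0.01)  # roughly 9 J stored each stride

# If the tendon returns that energy on recoil, the muscles do that much less
# work per stride. With these made-up numbers the saving is about half -
# the same order as the 'up to 50%' figure quoted in the text.
muscle_work_without_tendon = 18.0  # hypothetical J of muscle work per stride
muscle_work_with_tendon = muscle_work_without_tendon - per_stride
saving_fraction = per_stride / muscle_work_without_tendon  # about 0.5
```

The point of the sketch is only that a passive elastic element, stretched during braking and recoiling at push-off, recovers work the muscles would otherwise have to do afresh each stride.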

It’s not just the Achilles tendon, of course. There’s a whole suite of adaptations that may be related to running as a means of getting around. The question is, did these adaptations arise during the evolution of bipedal walking, or are they specifically related to running? I’ll summarise that discussion in the next post.

Reference: D.M. Bramble & D.E. Lieberman (2004) Endurance running and the evolution of Homo. Nature 432: 345-352

changing teaching techniques Alison Campbell Mar 05

No Comments

This post’s title is another one drawn from the search terms that brought people to my ‘other’ blog at Talking Teaching :-)

I’ve written quite a lot about the benefits students may gain as a result of lecturers changing the techniques they use in the classroom. A while back I wrote about the idea of helping students to visualise a paper’s curriculum, & this semester I decided to try that out with my first-year biology class. Today was the first day of the new semester, & I thought I’d share what I did with them — it would be interesting to hear what others think of this approach, so please do add a comment :-)

I kicked off with this slide — I thought the images captured some of the confusion that many first-year students seem to share as they enter their first year of uni study. It’s a fair bet that all the new terms & concepts thrown at them in many ‘traditional’ paper outlines don’t help :-)

Then I listed the obvious: the various classroom ‘styles’ they’ll be experiencing (ie lectures, labs & tuts). And pointed out that there are definite bi-directional links between them — this is because (in my experience, anyway) some students tend to see them as isolated entities. When I first tried my hand at a diagram like this my wonderful friend & colleague Brydget pointed out that it was way too complicated; the kids would just get lost in the detail. I took her advice & had another go :-)

And then I asked, OK, when you enrolled in this paper, what did you think you’d be doing & learning? This was the very first class so I wasn’t sure what responses I’d get, if any, but I wanted to send the message from the start that this is how I teach & that active participation is the norm in my lectures. But people put their hands up. ‘Content,’ they said; ‘stuff about plants & animals & how they function & how they interact with their environment.’ ‘Great!’ I said, ‘and I need to make sure that we do look at some of this, because my colleagues further down the line will expect you to be familiar with this material.’

‘But wait!’ I said, ‘there’s more!’ (Because beyond ‘dissections!!!’ no-one had mentioned any process skills.)

So now we could look at those other skills & why they are relevant. We’d talked a bit about plagiarism at orientation last week, so I could check back on their understandings around this — & emphasise that we’ll be working with them to develop their skills in academic writing, referencing, citations & so on. And critical thinking — to me, this is surely one of the most important skills that any student could acquire during their time at university.

Now, where are we going with all this?

Well, there’s the obvious one — that first-year is expected to turn out students with the knowledge & skills that they’ll require if they’re going on to further study in the subject. But there’s a second, equally important point here, and it hinges on the fact that there are quite a few students in the class who aren’t going to major in biology, & who may not actually be science students at all — they’re taking the paper as an elective in another degree altogether. What do I hope they will gain from it?

Yes — apart from (I hope!) helping them gain an enthusiasm for & appreciation of the living world, I really really want to enhance the scientific literacy of all my students, so that they can apply this understanding in their own future lives, regardless of whether they’re going on to a career in the sciences.

Now, I don’t know what the class thought of this approach — yet. I’ve asked them to let me know (anonymously if they like) through our Moodle page. But it would be good to hear from readers as well :-)

ensuring student success Alison Campbell Mar 05

2 Comments

The other day my colleague Nigel Robertson (from the uni’s centre for e-learning, "WCeL") sent through a link to this article: Ensuring student success – students are not to blame. The writer, Arshad Ahmad, begins by saying that

[many] students may appear to be unqualified, unprepared and uninterested. But if you believe, as I do, that each one of them has a talent, each one of them has a capacity to develop – intellectually and emotionally – then it follows that each one should be given a fair chance to succeed.

 And he goes on to say that

[there] is an alarming scarcity of interdisciplinary courses, little integration of existing courses, and almost no alignment to achieve the specific outcomes that these collections of courses are geared towards.

This is especially true of first-year courses with large impersonal classes taught by teaching assistants and part-time instructors.

It is exactly during this time, the first-year experience, when students are making important transitions, when students require a lot of personal attention and when they seek faculty [ie staff] time. It is a time when we should put our best teachers on the front lines and offer an experience that few, if any, students will be able to refuse.

I couldn’t agree more, & it’s the reason we (my wonderful colleague Brydget, & I) keep reviewing the labs we offer to our first-year bio students. Lab classes being so much smaller than lecture streams, they represent an excellent opportunity to give students one-to-one attention. This year we’re trialling a peer mentoring system: another chance for students to form good working relationships with others in their class & work together to enhance their learning.

And it’s also why I keep banging on to people about the benefits of moving away from the traditional lecture format. (Don’t have to worry about the teaching assistants bit there as the lecturers are the people who front up to these classes.) 

Arshad’s article is a strong argument in support of the need for regular & thorough review of teaching programs – not just individual papers, but the actual degree programs themselves. Otherwise there is always the risk that the collection of papers, overall, can lack focus – and that is not going to produce the best learning outcomes for our students.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

At Enough of the Cat Talk, Darlena makes a similar point: "Would we demote an entire class of children for our inability to teach them?"


if meetings really lower iq… Alison Campbell Mar 01

No Comments

… then there’s little hope for the world :-)

I attend a lot of meetings; that’s the nature of my job. This morning the Dean came in & waved the front section of the NZ Herald under my nose. “Look,” he said, “all those meetings are really bad for you.” Scenting a way of getting out of them, I grabbed the paper & found the article in question (syndicated from the UK paper, The Telegraph).

“Attending meetings lowers your IQ,” cried the headline, & the article goes on to say that

[the] performance of people in IQ tests after meetings is significantly lower than if they are left on their own, with women more likely to perform worse than men.

The story’s based on a press release about research carried out at the Virginia Tech Carilion Research Institute. And that release showed that the research outcomes were rather more nuanced and complex than the newspaper story would have it.

Research led by scientists at the Virginia Tech Carilion Research Institute found that small-group dynamics — such as jury deliberations, collective bargaining sessions, and cocktail parties — can alter the expression of IQ in some susceptible people (Kishida et al. 2012).

In other words, meetings don’t necessarily lower your baseline IQ. What they may do is change how you express that IQ, particularly if you’re susceptible to peer pressure. The internal urge to conform can result in people making decisions as part of a group that they might not have made on their own, especially if they have concerns about their status in that group. (As the Virginia Tech release notes, this was shown to good effect in the superb film 12 Angry Men, with Henry Fonda leading a stellar cast.)

The researchers placed study participants in groups of 5 and studied their brain activity (using MRI scans) while the groups were engaged in various tasks. While the groups were working they were also given information about the intellectual status of group members, based on their relative performance on those cognitive tasks. (There’s a tendency for people to place great store on relative measures of IQ, & where they personally sit on the scale.) And afterwards, when participants were divided on the basis of their performance into high- & low-performing groups before their IQs were measured again, the two groups were found to differ quite significantly, despite the fact that all participants had statistically similar baseline IQs when tested at the beginning of the study.

Our results suggest that individuals express diminished cognitive capacity in small groups, an effect that is exacerbated by perceived lower status within the group and correlated with specific neurobehavioural responses. The impact these reactions have on intergroup divisions and conflict resolution requires further investigation, but suggests that low-status groups may develop diminished capacity to mitigate conflict using non-violent means.

As I said, this is altogether more nuanced, more complex, & much more interesting than the news story that caught the boss’s eye. I suspect I’ll be attending meetings for a while yet.

K.T. Kishida, D. Yang, K. Hunter Quartz, S.R. Quartz & R. Montague (2012) Implicit signals in small group settings and their impact on the expression of cognitive capacity and associated brain responses. Phil. Trans. R. Soc. B 367(1589): 704-716. doi: 10.1098/rstb.2011.0267
