Friday, 15 July 2016

School Management Information Systems

This is a subject that’s engaged me for many years. Unlike almost all the other posts in this blog, this one isn’t inspired by a report or piece of research I’ve read. There seem to me to be several key issues in this area.

The first of these is the monopoly that Capita’s SIMS has over the schools’ MIS market. In 2013 SIMS was still used by 80% of schools in England and Wales (see here). This dominance hasn’t been significantly dented by the appearance of competitor products, for example Progresso and Arbor, and it is the primary cause of the other key problems. Capita SIMS sits astride the schools’ MIS market without any competition significant enough to drive down its prices. As a result, Capita SIMS costs are very high. Not only are costs high for the schools using the software, but Capita also charges other suppliers for the right to write to its data systems. These costs suppress the development of innovative add-ons to SIMS.

Innovation is also suppressed by the high costs of market entry and the difficulty of winning any market share against such a dominant competitor. School MIS systems are very complex. They need to include data fields relating to a wide range of student assessment items across phases and sectors (primary, secondary, special, independent and state). The data fields relating to parents, families, staff and students are very numerous. Alongside these are the complexities of timetabling and financial management. There are also behaviour and attendance data, and systems to track students’ performance in these areas. Most of these need to be customisable to varying degrees to suit the local policies and systems of individual schools.

New competitors also have to cope with the rapidly changing demands of government regarding data reporting from schools. Teachers, Heads and MIS providers all suffer from the ever-moving goalposts that the DfE keeps on wheels. The recent change of Secretary of State will probably soon yield a new set of requirements. Every time the DfE decides it needs a new data item reported, for example a phonics score for every Year 2 student, suppliers of MIS systems have to update their product. This is an overhead that must be challenging to manage.

Right from the very beginnings of SIMS as an amateur development project, the interface has always lagged behind the best software. The origins of the product lay in providing schools with ways of collecting, holding and understanding the main datasets relating to school performance, not in empowering teachers in the classroom with solutions that significantly reduced their workload and increased their efficiency. Some developments have done the latter. The introduction of e-registration (led originally by Bromcom, I believe) made life much easier for teachers and administrators. Instead of having to trawl through paper registers each morning to see which students are absent, it’s now a pretty much automated process to identify missing students and message parents. But progress has been painfully slow. Only in the last few years have tablet apps appeared that allow teachers to quickly record behaviour incidents. The potential to do so has been with us for at least four years on tablets and more like ten years with laptops.

Local installations are still extraordinarily common for MIS systems. This makes it much more difficult for schools to ensure the availability of their MIS. Even small primary schools need local technical support expert enough to maintain the MIS server, update it and back it up. A number of years ago I visited a local primary school that had lost its SIMS server to hardware failure. The school had a local support contract but had never checked that it included backup of the SIMS database. So not only had it lost the server, it had also lost all its historical data.

The potential for MIS systems to transform the ways schools function is enormous; the surface has only been scratched. When data is in the cloud there is potential for schools to learn from others in the same system. How helpful would it be if your MIS could point you at departments in other schools where students with a very similar profile (in terms of prior performance, social characteristics and attendance) were achieving better outcomes? Wouldn’t it be great if you could see what strategies had previously been successful with a student who was misbehaving in your class? I’d like to see seating plans and behaviour systems showing you which students have the best chance of working together without incident. It would be very useful if teachers were able to track the effectiveness of homework. If they could assign a category or tag to a homework, for example extended writing or reflective writing, then looking back over the term they could see which types had the best completion rates. They might also be able to see which types of homework led to better results in end-of-module tests. Including lesson planning in an MIS would also allow tracking of results: which categories of lesson had the highest or lowest rates of behaviour incidents, or the best end-of-module test outcomes? Obviously this kind of data wouldn’t always provide clear answers, but aggregated across departments it might lead to some very useful and well-informed professional discussions. In the 15 years I spent as a teacher, all these professional discussions were based on hunches and anecdotes, not data.
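To make the homework-tagging idea concrete, here is a minimal sketch of the kind of aggregation an MIS could run over tagged homework records. Everything here is hypothetical and invented for illustration: the record fields, the tags and the scores come from no real system.

```python
from collections import defaultdict

# Hypothetical records: (tag, completed?, end-of-module test score).
# In a real MIS these would come from the homework and assessment tables.
homeworks = [
    ("extended-writing", True, 72),
    ("extended-writing", False, 55),
    ("reflective-writing", True, 64),
    ("reflective-writing", True, 70),
    ("worksheet", False, 48),
    ("worksheet", True, 60),
]

def summarise_by_tag(records):
    """Return {tag: (completion rate, mean test score)} per homework type."""
    buckets = defaultdict(list)
    for tag, completed, score in records:
        buckets[tag].append((completed, score))
    summary = {}
    for tag, rows in buckets.items():
        completion = sum(1 for done, _ in rows if done) / len(rows)
        mean_score = sum(score for _, score in rows) / len(rows)
        summary[tag] = (round(completion, 2), round(mean_score, 1))
    return summary

print(summarise_by_tag(homeworks))
```

Even a toy aggregation like this turns "which types of homework get done, and do they help?" from a hunch into a number a department could discuss.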

Teachers are the very best source of ideas about how to innovate MIS. But you have to ask them the right questions. If you ask them how their present MIS might be improved you’ll probably get some good ideas about enhancements to the interface, shortcuts that reduce the number of steps to complete a task and complaints about illogical nomenclature. But ask them what takes up inordinate amounts of their professional time or what they would really like to be able to do with data and with a good understanding of both teaching and data you could begin to develop some exciting innovations.

So how can this situation be unpicked? At the level of government policy, intervention in the marketplace might be very helpful. Given our present administration’s high opinion of the power of free markets, this seems unlikely.

Competitors need to find ways to offer a very compelling alternative to Capita SIMS. Schools are generally pretty conservative (small ‘c’), so changing their MIS isn’t something they consider very frequently. The present pressure on school budgets might make this happen a little more often as schools get around to looking very critically at all areas of their spending. A big barrier is the institutional cost of changing. Unless the new system is very intuitive and simple, there will be very big costs both for staff training and in lost productivity while staff need help remembering how to use the new system. If Capita didn’t take a fee from systems building on theirs, there might be ways to stealthily eat out SIMS from the inside: you provide one compelling add-on after another until there is very little of the original SIMS system the school still uses beneath your products. Then they are ripe to be transitioned completely away from SIMS. Another big problem for the kind of cloud-based opportunities I mentioned earlier is that they become more and more powerful as the user base increases. With several thousand schools there are very real benefits in being able to learn from other schools, but that isn’t true if you are the third school to use the system.

Perhaps a far-sighted capitalist will see the opportunities here and invest in the development of a product that takes more than a decade to produce real returns? If so, we could see some very exciting innovations in MIS functionality.

Flipping Alone isn’t the Answer

This is a piece of research that appeared in CBE—Life Sciences Education in July 2015 (available here). The study looked at the performance of students studying for a Biochemistry Major at the University of Massachusetts–Amherst. Over a five-year period the researchers captured student performance in online homework activities as well as end-of-semester tests. The study looked at the performance of 489 students over the whole period. Of these, 244 engaged in active learning in the face-to-face sessions, meaning the use of “personal-response hardware in class”, “student–student interactions facilitated by instructor” and “team-based, collaborative student interactions in class”. The chief conclusion of the paper is that a combination of flipped learning and active learning in class made a significant difference to outcomes in end-of-semester tests. As the authors say, this approach “encourages students to become more engaged with course material, persist in their learning through more timely and accurate preparation, and, ultimately, perform better”. The effect is greater “for lower-GPA students and female students”.

Another interesting aspect of the research is the context of the study. “The initial impetus to convert the course described here from a standard lecture format to the flipped format was to keep class sizes from growing (due to increasing numbers of student majors) without substantially increasing the in-class time commitment of the instructor.” In other words, as well as improving outcomes, the approach reduced the face-to-face commitments of instructors. But this “increase in instructor efficiency is counterbalanced by the need for extensive development of online material on the part of the instructor, although that effort rapidly diminishes after the first offerings of the flipped course”. After a substantial initial investment of instructor time (and presumably some training for these staff) to create the online resources, fewer resources were then required to achieve better results. This study took place in the United States, within a STEM course in HE, and the numbers involved are relatively small. Allowing for these provisos, this research should prompt other HE providers to investigate the benefits of such an approach.

Monday, 13 June 2016

Research on London Challenge

Tony McAleavy and Alex Elwick, School Improvement in London: A Global Perspective, CfBT Education Trust, 2015

The fact that as this study says “London schools have improved dramatically since 2000” and that I ended my teaching career in London in July 2001 are surely not causally linked. Correlation and causation again. Although it is hard to resist feeling like I left just before the party really got going.

This study builds upon an earlier work from CfBT looking in detail at the causal factors underpinning the improvement in London schools between 2000 and 2012.

The study assumes that “what has changed is the internal effectiveness of the schools”. There is a rather cursory dismissal of the possibility that the changes resulted from factors external to the school system: just 19 lines, without a single reference or footnote, are devoted to examining this possibility. It seems a weakness in the research to begin by discounting one set of possible causes without examining any evidence. The authors may be right or wrong to make this assertion, but this study does not help the disinterested reader examine the question.

Nicky Morgan (and probably Michael Gove), should they read this research, will be delighted at how it confirms the direction of government educational policy after 2010. Look at the key factors identified as “enabling” the success:

  • “The power of data”, where “the growth in the use of education performance data and improved data literacy among education professionals” has been highly significant.
  • “The importance of professional development”, particularly because “training became increasingly the responsibility of practitioners rather than expert advisers who had left the classroom”.
  • “The contribution of educational leaders” they argue was significant not just because good leaders were recruited but “the most significant aspect of the London story was the emergence of the best headteachers as system leaders. … These outstanding headteachers were able to provide highly effective coaching support to other schools.” Just in case you were too dim to spot the reference they add that “the idea of Consultant Leaders has been adopted at national level” by the present government.
  • “The significance of sustained political support”, so that the strategies were given time and support. In fact they helpfully observe “Teach First and the academies programme continue to this day”.

The Statistical Improvements

I am not impressed by the authors citing both improved GCSE results and better Ofsted inspection outcomes as two separate and independent variables indicating progress. The former dictates the latter, as any analysis of the data shows. The improvement in attainment of “high-poverty background students” is impressive (although it’s a shame this category isn’t defined). I would like to know whether there have been changes in the proportion of these students within individual school populations or across inner or outer London as a whole. There is evidence that where students from deprived populations make up a small minority of a school, they experience a smaller deficit in attainment. The authors are more impressed than I am by the changes in the gaps between the attainment of disadvantaged and other students when London is compared to the rest of the UK. There have been improvements nationally, and while the difference is smaller in London, it is hard to know whether this is due to some extreme outliers in the UK-wide data. As in some other parts of this report, the unwillingness of the authors to look seriously at possible criticisms or alternative views ultimately weakens rather than strengthens their argument.

iPad Research

I've been undertaking research into 1:1 iPad projects as part of my present role. In order to get some context I've been reading past research. There isn't yet a great deal of work done on iPads, so the number of papers was pretty small. If anyone reading this is aware of good research work, please let me know. The following are my summaries of some of the more interesting pieces.

Rana M. Tamim, Eugene Borokhovski, David Pickup and Robert M. Bernard, Large-Scale, Government Supported Educational Tablet Initiatives

This isn’t so much a review of large-scale educational initiatives in tablet computing in schools as an extended rant. That the rant is justified is also very clear. Whatever happened to the idea that policy should be evidence-based and rigorously analysed? This study finds precious little evidence of that in global approaches to technology in education. Massive funds are being spent without any clarity about why, or about what the outcomes were.

The study starts by admitting that there’s plenty of evidence that technology can enhance outcomes. Tablets are currently the most fashionable educational technology initiative. The study sets out to answer these questions:

· What explicit and implicit factors are motivating governments to launch tablet initiatives?

· What financial and organisational models are governments using to implement their tablet initiatives?

· What are the intended educational outcomes of the tablet initiatives?

· To what extent are the tablet initiatives aligned with educational policies and strategies?

· To what extent has the use of tablets been integrated with the curriculum?

· What provisions have been made to develop or provide access to relevant educational content on the tablets?

· What provisions have been made for teacher, student and parent preparation for the use of the tablets?

As an overall conclusion this is pretty damning: “the task proved to be more challenging than expected because of the limited amount of publicly available information, the overall findings of the review confirm the original assumption: that the majority of the tablet initiatives are launched with a hasty and uncalculated approach, often weak on the educational, financial or policy front.” p. 21

Questions and Answers

What explicit and implicit factors are motivating governments to launch tablet initiatives?

The report is very scathing, “the stated objectives included catchphrases and buzzwords that may have been more fitting for public relations and political campaigns than for educational reform actions” p. 23

What financial and organisational models are governments using to implement their tablet initiatives?

The report indicates that published material about this aspect of these initiatives was very limited. It points out some enormous discrepancies: for example, both Jamaica and Turkey spent $1.4 billion on tablets, but the former supported only 24,000 students whereas the latter helped over 10 million. The report doesn’t analyse this further, but it’s hard to understand how Turkey managed to achieve anything significant with $140 per student. Equally, it is hard to see how Jamaica invested c. $58,000 per student, even allowing for training and infrastructure spending.
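The gap between the two programmes is easy to check with simple arithmetic. A minimal sketch, using only the headline figures as reported in the review:

```python
def per_student_spend(total_usd, students):
    """Crude per-student figure: total programme cost divided by headcount."""
    return total_usd / students

# Figures as reported: both programmes cost c. $1.4 billion.
jamaica = per_student_spend(1_400_000_000, 24_000)      # 24,000 students
turkey = per_student_spend(1_400_000_000, 10_000_000)   # over 10 million students

print(round(jamaica))  # c. 58333 dollars per student
print(round(turkey))   # 140 dollars per student
```

A two-orders-of-magnitude difference in per-student spend for nominally similar programmes is exactly the sort of discrepancy the review says went unanalysed.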

Educational Factors

Probably because they could find so little hard data the final five questions collapse into one section in the report. They don’t mince their words “the initiatives focused on the hype around tablets and not on their use as a tool to achieve an educational goal” p. 24.

This is very frustrating for someone involved in educational technology. Clearly governments are spending money on tablets but without any transparency about the educational aims, financial systems or impact these projects involve. It is easy to assume that’s because the thinking hasn’t been done. It’s also a criminal waste of money to carry through these projects without maximising the learning they generate.

Ha├čler, B., Major, L. & Hennessy, S., Tablet use in schools: A critical review of the evidence for learning outcomes, Journal of Computer Assisted Learning, June 2015

Just when I was despairing about tablets, education and intelligent analysis this paper came to my attention.

Unlike the work on large scale tablet initiatives it is much smaller scale and teacher facing in its focus and all the better for it. The study attempts to uncover research that looks closely at how tablets can impact on learning. As it says “the fragmented nature of the current knowledge base, and the scarcity of rigorous studies, make it difficult to draw firm conclusions” (p. 1). But there are some interesting pointers and useful distinctions in this study that make it worth reading.

The authors carried out a search of published material using strict criteria that left them with just 23 studies. Four of these involved fewer than 10 subjects. These limitations are indicative of the lack of rigorous research on this topic.

The published studies were mostly positive about the impact of tablets:

· 16 reported positive learning outcomes;

· 5 reported no difference; and

· 2 reported negative outcomes.

Looking at the positive results the paper finds a number of factors that seem to contribute to successful outcomes. These are:

· high usability and integration of multiple features within one device;

· easy customisation and supporting inclusion;

· touch screen; and

· availability and portability.

The authors delineate some practical considerations. It won’t surprise anyone to read that “effective technology management is critical to the successful introduction of tablets and this should be underpinned by sound change management principles” (p. 13). It also seems evident that “a robust wireless infrastructure, with sufficient capacity to accommodate entire classes of tablets connecting simultaneously” (p. 14) is essential. Good cases are needed for “younger children” (p. 14).

Less tangible factors are also identified. They state that “a supportive school culture that fosters collegiality and teacher empowerment at different levels can be pivotal for the effective introduction of tablets” (p. 13).

It is interesting how differing schemes for distributing tablets affected outcomes. “In the one-to-one setting [that is, one tablet per student], there is no competition for tablets among students, and in the studies reviewed there was consistently high group participation, improved communication and interaction. However, the many-to-one groups [i.e. many students to one tablet] generated superior artefacts as all the notes were well discussed among the group members” (p. 15). This finding challenges the common-sense idea that one-to-one schemes are better.

The authors’ practical approach is admirable; for example, they note that the “trade-off between number of devices, screen size, cost, and corresponding effective learning scenarios, remains completely unexplored in the research literature” (p. 18). They point to a fruitful path forward for tablet research, one that focuses on the classroom and school issues that might make the difference between the success and failure of a procurement.

Kevin Burden, Paul Hopkins, Trevor Male, Stewart Martin and Christine Trala, iPad Scotland Evaluation, 2012

This piece of work is now quite old. The title may give the impression that this is a wide study. It isn't. The evidence comes from just 3 secondaries and 5 primaries, mostly in authorities in or neighbouring Edinburgh (apart from one school in Aberdeen). The work involved interviews with staff and students. There wasn't any quantitative data.

They found that "teachers noted that ubiquitous access to the Internet and other knowledge tools associated with the iPad altered the dynamics of their classroom and enabled a wider range of learning activities to routinely occur than had been possible previously". Having iPads also "encouraged many teachers to explore alternative activities and forms of assessment for learning".

Students were found to show "increasing student levels of motivation, interest and engagement", "greater student autonomy and self-efficacy" and "more responsibility for their own learning".

Apparently "little formal training or tuition to use the devices was required by teachers; they learned experientially through play and through collaboration with colleagues and students".

There is a great deal more detail in the study and if you are considering iPads it is worth reading in full.

Finally, I should reference Donald Clark's unrivalled, unequivocal dislike of tablets in schools as a counter to any positive impressions created above, for example here.