After writing about the new UNESCO ICT Competency Framework for Teachers yesterday, I spent time reading through the model, and thinking about how the information could be applied. I found some sections which were really useful in the context of Business Intelligence in education - how schools use their own data to improve students’ learning. In Appendix 2 of the document, there’s an example syllabus for Technology Literacy, and there are two specific sections which deal with using data to manage and monitor student performance. As well as scoping out what they mean, there’s a really useful list of best practices and obstacles, as well as common and critical mistakes. If you’re thinking of developing a business intelligence (BI) project in education, bringing together a varied mix of learning and assessment data to create a comprehensive picture, it’s worth reading the list of common and critical mistakes to learn from the experiences of others.
Published case studies rarely focus on the mistakes made during a project - but the broad base of contributors to the UNESCO framework means that it captures lessons from many projects, from many countries.
What I’ve done is to pull out the key sections on data use for monitoring student performance from two different sections of the document - Curriculum & Assessment (2.4) and ICT (4.6) - and summarise them together to give the full picture. The sections cover the selection of a tool to monitor and share student performance data, and using software to manage student and classroom data (from pages 49 and 57 of the Competency Framework document).
Scope of what’s covered
Using the UNESCO framework, I’ve combined the two scope elements into one:
> Using ICT to record, manage and report on student performance data (grades, portfolios of student work, recognition of student achievement, reports to students, parents and administration). Includes use of standalone and networked software; use of spreadsheets; use of a school management system (for the purposes of attendance, record keeping, grades, student enrolment, timetables etc.)
So if that’s the challenge, what are the key nuggets that the document contains? And how can we apply them to a project rolling out a system for business intelligence in education? Well, there are some key issues - advice, obstacles and mistakes - that it identifies from the projects that have been looked at. If you look at your projects (or your plans for future projects), how many of these areas can you feel confident about - and are there extra things you can do to reinforce the good practice and minimise the risk of mistakes?
The bullet points below are taken from the UNESCO report, with my additional comments after each section.
Best practice advice
- Creating a culture of data-quality
- Keeping up-to-date with data-entry
- Using data from a wide variety of sources to monitor performance: use different types of assessment, comparisons with other students, teachers or schools
- Using ICT-based systems to improve parent involvement through better information flow to them
- Making use of the improved information which ICT-based systems can provide, for example early indicators of a failing student or teacher revealed by timely and detailed ICT records of grades
Using data effectively is a journey, not a single final destination. Good practice will evolve, so it’s okay to start with something that isn’t yet ideal - for example, using just a single source of data to monitor performance initially, as you work to make more of your data usable and extend the way it’s used. Similarly, if you want to improve parental communication but don’t have much in your existing systems that you can share, start with a little information and increase it as you go along. Don’t wait for everything to be collated, databased and analysed before starting to use the data. Use the best practice advice to set your direction, and then tackle the task in steps. For example, you might aspire to get your teachers to enter all of their markbook data online, but your first steps might involve creating the culture of data-quality and data-entry, rather than mandating that everything goes into your system.
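To make “start small” concrete, here’s a minimal sketch of what a single-source starting point might look like - just per-class averages from one markbook CSV. The file layout and column names (`student`, `class`, `score`) are my invention, not anything prescribed by the framework:

```python
import csv
import statistics
from collections import defaultdict

def summarise_markbook(path):
    """Read a simple markbook CSV with columns student,class,score
    and report the mean score per class - one data source, one report."""
    scores = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            scores[row["class"]].append(float(row["score"]))
    return {cls: statistics.mean(vals) for cls, vals in scores.items()}
```

Even a report this basic gives staff something to look at and react to, which is what starts the data-quality conversation.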
Obstacles
- Lack of hardware, software resources and financial resources
- Lack of culture of accountability
In Australian schools, the first obstacle is probably not a serious impediment - there will be enough resources available. The culture of accountability will vary between individual educational institutions, and you’ll need to make sure that any change plan allows sufficient time and focus to achieve complete buy-in from staff - leaders, admin and teachers - towards your (and their) end goals.
Common mistakes
- Incorrect data entry (including incomplete data)
- Poor data management skills
- Not keeping passwords secure
- Incorrect formulae to calculate results
- “Garbage in garbage out”
- Not verifying the captured data
- Incorrect formulae or analysis (for example, selecting the wrong type of graph for a report)
So let’s be positive - these are common mistakes, which means that you’re likely to make one or more of them. The good news is that you’ve got the list - based on other people’s projects - to use as a sanity check when your education BI project turns up some bizarre data quirks. When you’re surprised to find that all of class 6W are mini-Einsteins, don’t be surprised to find out that their teacher was using a 1-5 scale, when the rest of the teachers were using a 5-1 scale.
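As an illustration of that kind of sanity check (the class names and the threshold here are invented for the example), a few lines of Python can flag a class whose average sits suspiciously far from everyone else’s - exactly the pattern a reversed grading scale produces:

```python
import statistics

def flag_outlier_classes(class_means, threshold=1.5):
    """Flag classes whose mean grade sits unusually far from the overall mean.
    Often a sign of a data-entry quirk (e.g. a reversed 1-5 grading scale)
    rather than a class full of mini-Einsteins."""
    overall = statistics.mean(class_means.values())
    spread = statistics.pstdev(class_means.values()) or 1.0  # avoid divide-by-zero
    return [cls for cls, mean in class_means.items()
            if abs(mean - overall) / spread > threshold]
```

It won’t tell you *why* 6W looks odd - but it tells you where to go and ask the question before the report reaches a parents’ evening.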
If you treat the early stages of a project as a learning journey, then all the staff can learn together, and iron out the wrinkles before it does any harm!
Critical mistakes
- Not keeping confidential information secure
- Allowing vulnerability to hackers
- Incorrect conclusions from inaccurate data
- Inaction in the face of available data (failing to use the information provided by ICT-based system because such information did not previously exist)
- Not having backups of the data
These mistakes are important because they could derail your project. Making a ‘common mistake’ - e.g. a report that throws up wrong answers - might demotivate the team, and cause people to question what’s going on. But making a ‘critical mistake’, like a lack of data security, may well derail the project on its first day, and cause the whole thing to be stopped. Again, the benefit of having the list is that you can use it as a project tick list:
- Is the data secure? Tick.
- Is the data backed-up? Tick.
- Will we cross-check critical data before it impacts on decisions that affect students’ learning? Tick.
In some ways these are the hygiene factors which have to be right - and will hit you immediately if they are wrong. The hidden one is the fourth - will you do something with the data? For example, if you find out that one teacher has a significant impact on exam results, will you find a way to use that insight to benefit all students? And if you find that a course module that you’re all attached to produces poor results, will you simply live with it, or will you improve or drop it?
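If you wanted to automate the tick list, a rough sketch might look like this - the dictionary keys describing the project are hypothetical, not a real system’s API:

```python
def pre_launch_checks(project):
    """Run the 'hygiene factor' tick list as simple yes/no checks and
    return the names of any checks that failed.
    `project` is a hypothetical dict describing the BI rollout."""
    checks = {
        "data secure": project.get("access_controlled", False),
        "data backed up": project.get("backup_schedule") is not None,
        "critical data cross-checked": project.get("verification_step", False),
        "plan to act on findings": project.get("action_owner") is not None,
    }
    return [name for name, passed in checks.items() if not passed]
```

An empty list means the hygiene factors are covered - including the hidden one, which only passes when someone actually owns acting on the findings.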
How will you use this information?
Now you’ve got the info, how does it help you? Is it something to go into your project plan? Or if you’re buying a BI system, is it the question list you use to test all of the suppliers? Or if you’re already using data well, does it help you to define your next step?
If it helps, I’ve dropped the bullet points above into a short series of PowerPoint slides for use when talking with colleagues. I’ve called it “Common Mistakes in BI projects in Education”.