Measurement, Evaluation and Change
This history gives a flavour of my journey through technology development, commercialisation, and transformational change to arrive at an understanding of the importance of measurement and evaluation in enabling change. It begins in real-world applications of physical measurement and ends in my current work with the Bayswater Institute, where we are partnering with a range of organisations to support change and the measurement of outcomes that matter to people.
Computing in the 1960s
Growing up in Birmingham, England in the 60s and 70s was a turbulent time. I went to an inner-city comprehensive school called Jaffray in a suburb of Birmingham called Erdington. I remember being fifteen during the “Winter of Discontent” in 1978. The attempts at pay restraint by James Callaghan’s government and the strikes that followed led to the election of Margaret Thatcher in 1979. I remember the furore caused by the Sex Pistols appearing on the Bill Grundy Today Show in 1976. I left Jaffray in 1979 to go to college on the other side of Birmingham in Bourneville. The college there was the only place I could study pure and applied mathematics as separate subjects. Before I left school, one of my science teachers at Jaffray had brought into the school a Nascom 1, a self-built computer. The specifications make interesting reading compared to what we have some thirty-eight years later:
- A real keyboard with button keys
- A 4 MHz Z80A processor
- 1 KB static RAM
- 2 KB Monitor (NAS-SYS 1)
- 8 KB Microsoft BASIC
- A 48×16 video interface with a modulator output to drive a domestic TV
- A serial in/out port selectable between cassette tape, RS-232 or current loop for teletypewriters
I wasn’t quite sure why I became obsessed with the idea of a computer, but I thought they had potential for many applications. This was another reason for travelling across Birmingham every day: Bourneville College had a computer laboratory. Having seen the Nascom 1, I immediately started going to night school to learn about computers. Even though there was a teacher at Jaffray who had a personal interest, there was no curriculum or structured approach to teaching computing at secondary school level. So I learned COBOL on Thursday nights at a local night school. I often walked home so that I could spend the bus fare on a can of Breaker Malt Liquor. Good times.
I had targeted the computing department at Bourneville but there was a problem. By the time I went to college there was an A’ level in computing but I had no O’ level. I had my night school qualification but the head of computing said that this wasn’t enough to get onto the A’ level course. I made a significant life decision at this point. I said I did not accept his assessment and that I was coming to the lessons anyway. I suggested that if I struggled after the first few weeks I would withdraw from the course. Initially both tutors on the course resisted my presence. I fought back by answering all of the questions asked in every lesson. After the first few weeks they relented and I spent a lot of time in the computer laboratory learning about Z80 and 6502 based machines. One of the reasons they resisted was that the course was unusual in being a one-year A’ level (probably because computing was developing so rapidly.) By the end of the first year I had written a new admissions system for the college along with one of the tutors. Running in BASIC on a Research Machines 380Z, it used a DEC VT50 terminal to present enrolment information, pulling course codes from a random access database. Cutting edge for 1981. I sat the A’ level after one year and passed. I left Bourneville in 1982 with four mediocre A’ levels.
Sociotechnical Systems and Agile Project Management
What I did not realise at the time in Bourneville was that the admissions system we developed was my first brush with sociotechnical systems. We had written the system from scratch and it was purpose built to eradicate the paper shuffling required in the first few weeks of term when people were registering and changing courses. It generated reports by course and student. It could also do analysis of departmental numbers and facilitate monitoring of course numbers against limited places. The course tutor acted as the project sponsor in the college and I was the coder. We worked very closely and the specification was almost developed between us as we implemented the system. An early form of agile project management?
In 1982 I went to Preston Polytechnic (subsequently to become the University of Central Lancashire.) Once again I had chosen the location based on what I wanted to study. I was able to get onto the physics and mathematics joint honours degree and study astronomy in the first year. I loved the first year and chose to focus on physics and mathematics. In the second and third years I gradually lost my way and my performance deteriorated, except in one area. For my third year project I worked in collaboration with the physics and chemistry departments and wrote a computer simulation of liquid crystal molecules. Written in Fortran IV, the simulation successfully predicted phase transitions in an assembly of molecules. I also began to extend the simulation to explore disrupting the phase transition with magnetic fields. This allowed me to publish my first paper in 1987, entitled “A Monte-Carlo Study of a System of Anisometrically Interacting Particles.” In my third year physics I had specialised in magnetism. The physics department was very strong in magnetic recording media research. However, I failed my third year exams for the degree. This left the University with a dilemma and me with a huge issue about failure. My third year project was well on the way to an M.Phil. but I had not achieved graduate status. I can only say that some remarkable people rallied around to ensure that I got a second chance. I was permitted to stay registered at the University with the opportunity to re-sit the exams the following year. I even managed to pick up some teaching on a Higher National Diploma course and to engage in research.
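The Monte-Carlo method behind that simulation can be sketched in a few lines. This is emphatically not the original Fortran IV code: the pair energy below is a generic Lebwohl–Lasher-style anisotropic interaction, and every parameter (coupling, temperature, lattice size, step size) is an illustrative assumption of mine rather than anything from the published study.

```python
import math
import random

# Minimal Metropolis Monte-Carlo sketch for a chain of anisotropically
# interacting particles. Each site i has an orientation theta[i]; nearest
# neighbours interact through a Lebwohl-Lasher-style pair energy
#   E_ij = -J * P2(cos(theta_i - theta_j)),  where  P2(x) = (3x^2 - 1) / 2.
# All parameters are illustrative, not those of the original study.

def p2(x):
    return 0.5 * (3.0 * x * x - 1.0)

def pair_energy(t1, t2, coupling=1.0):
    return -coupling * p2(math.cos(t1 - t2))

def site_energy(theta, i, coupling=1.0):
    n = len(theta)
    # Periodic boundaries: each site interacts with left and right neighbours.
    return (pair_energy(theta[i], theta[(i - 1) % n], coupling) +
            pair_energy(theta[i], theta[(i + 1) % n], coupling))

def total_energy(theta, coupling=1.0):
    n = len(theta)
    return sum(pair_energy(theta[i], theta[(i + 1) % n], coupling)
               for i in range(n))

def metropolis_sweep(theta, temperature, max_step=0.5):
    """One sweep: attempt a small random rotation at n randomly chosen sites."""
    n = len(theta)
    for _ in range(n):
        i = random.randrange(n)
        old = theta[i]
        e_before = site_energy(theta, i)
        theta[i] = old + random.uniform(-max_step, max_step)
        d_e = site_energy(theta, i) - e_before
        # Accept downhill moves always; uphill with Boltzmann probability.
        if d_e > 0.0 and random.random() >= math.exp(-d_e / temperature):
            theta[i] = old  # reject: restore the previous orientation
```

Repeated sweeps at low temperature drive the assembly towards an ordered, low-energy state; sweeping the temperature while watching an order parameter is how a phase transition would be located with this kind of model.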
Healthcare and Technology
From 1985 to 1986, as I waited to re-sit, I taught mathematics and picked up a research project in using magnetic fields to probe the human thorax. The project was in collaboration with Swansea University and focussed on measuring very low level magnetic field disturbances to detect pulmonary embolisms, using a device called a Superconducting Quantum Interference Device, or SQUID. It was hoped that the need for MRI scans could be reduced by using a cheaper approach. My role was to produce the computer model of the magnetic system and predict the field around the thorax with and without the presence of an embolism. The predictions were then tested in Swansea using a real SQUID. This resulted in two publications in 1988, one entitled “Rapid methods for the calculation of the magnetic fields associated with the human thorax” and the other “Design and Assessment of SQUID Magnetometers Using Reciprocity Methods.” Although the potential to detect embolisms was demonstrated, this approach did not achieve mainstream adoption and diffusion. A clue I should have picked up along the way.
Magnetic Storage and User Centered Design
During 1985 I also became involved in a project with ICI in Runcorn. They were developing an electroless deposition process for the manufacture of thin film magnetic recording media. Such films were generally manufactured using sputtering technology in vacuum chambers. The nature of that process dictated that manufacturing would be batch based. Electroless deposition held the potential of being a continuous process. The project required the bringing together of chemists, physicists, process engineers and technicians. I worked in two areas of measurement central to characterising the resultant media. I had responsibility for the running and management of a measurement device called a vibrating sample magnetometer (VSM.) This allowed the magnetic characteristics of the tape to be measured. This was the period when the first IBM PCs were released, and the system was based on an early model. It had a Hercules graphics card (graphics were not native to computers then.) It interfaced to the magnetometer through some home-made electronics developed within the University. The magnetic field was controlled using a stepper motor turning the rotor of a variable power supply. The magnetic field polarity was switched using a relay. A crude set-up by today’s standards, but even this level of automation transformed the ability to run many samples during the day. My role was two-fold – I ran the measurement on samples of tape that came out of the chemistry process and I developed the magnetometer to include new measurements as the sophistication of the chemistry increased. The introduction of new measurements was an interaction between the chemists, trying to understand how the chemistry was changing the formulation of the thin film grains, and the physicists.
The physicists would suggest new measurements and I would implement the code to perform them; the chemists would then see if the new measurements correlated with how they thought the chemistry was changing the physical structure. The understanding was then triangulated with various beam scattering experiments that gave more information about the structure of the thin film layers. It was in the implementation of these new measurements that I translated the needs of the chemists and physicists into changes in the operation of the magnetometer, and recorded and presented the data in a way that was fit for purpose. This user-centered design was iterative and required real understanding of what the customer (the chemists and physicists) needed. My work was latterly extended to undertaking tribology, or wear-based, measurements on the films. As the chemistry got closer to the required structural and hence magnetic characteristics, the requirement for the tape to perform under wear in a video recorder needed to be tested. This required different lubrication and hardening layers to be used to produce the right wear characteristics without changing the magnetic characteristics.
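The shape of that measurement automation can be caricatured with a toy control loop. The hardware here is a simulated stand-in (a crude tanh hysteresis model), and all the function names and parameters are my own assumptions; none of the original home-made electronics or its software is reproduced.

```python
import math

# Toy sketch of a VSM-style measurement loop: sweep the applied field
# through a full major loop, read the sample moment at each step, and
# extract a characteristic such as coercivity from the recorded points.
# The "sample" is a crude tanh hysteresis model, purely for illustration.

class SimulatedSample:
    """Hysteretic sample: the switching field shifts by +/- hc with sweep direction."""
    def __init__(self, hc=100.0, ms=1.0, width=50.0):
        self.hc, self.ms, self.width = hc, ms, width
        self.sweep_dir = +1  # +1 on the increasing-field branch, -1 on decreasing

    def moment(self, field):
        return self.ms * math.tanh((field - self.sweep_dir * self.hc) / self.width)

def field_sweep(h_max, step):
    """Yield (field, direction) pairs for one full major-loop sweep."""
    h = -h_max
    while h <= h_max:            # increasing branch
        yield h, +1
        h += step
    h = h_max
    while h >= -h_max:           # decreasing branch
        yield h, -1
        h -= step

def measure_loop(sample, h_max=500.0, step=5.0):
    """Set the field, update the polarity/sweep state, read the moment."""
    points = []
    for field, direction in field_sweep(h_max, step):
        sample.sweep_dir = direction
        points.append((field, sample.moment(field)))
    return points

def coercivity(points):
    """Estimate Hc as the zero crossing of the moment on the up sweep."""
    for (h0, m0), (h1, m1) in zip(points, points[1:]):
        if h1 > h0 and m0 < 0.0 <= m1:
            return h0 + (h1 - h0) * (-m0) / (m1 - m0)
    return None
```

Checking that the loop recovers a known reference sample's coercivity (here, the hc parameter fed into the simulation) is exactly the kind of sanity check such a rig needs before being trusted on unknown tape samples.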
Collaboration and Culture
Only in later years did I come to realise what a seminal experience the work at ICI on magnetic media was. It represented a true collaboration between a group of experts in a broad set of disciplines. The only goal was to make the right tape. There was little ego at work and everyone was pulling in the same direction. I then learned something really important about how such projects are managed within big corporations. Initially, the chemistry was being developed in the laboratory and all the measurements were on small samples. As the chemistry, magnetic and wear characteristics approached what appeared to be a workable product, a pilot plant was built. This was a small scale version of what the final manufacturing plant would be. At the point where the pilot plant had been built and the first samples were coming off the production process, the project was cancelled and closed down within months. Because of my youth (I was still only 23) I did not recognise how important the social and cultural aspects of the project were. I took for granted the collaborative approach and learned later that this was sociotechnical practice at its peak. We were mixing pure and applied chemistry and physics with the development of computer based measurements. It gave me the opportunity to develop skills in:
- Data acquisition
- Computer control of systems
- User centred design
- User experience design
It is a testament to the project that some thirty years on I can still bring to mind the names of many of the key players in the project. All those involved mourned the passing of this project.
Research in Pure and Applied Physics
I passed my degree in 1986 and wrote up the work I had undertaken on the human thorax modelling. This enabled me to register for a Ph.D. and I moved, full time, into research on magnetic storage. This ranged from thin films used in hard disk drives to magnetic particles used in magnetic tape storage. The thrust of the work was related to the development of measurements we had undertaken with ICI. I developed computer simulations of magnetic recording media that allowed predictions of how various measurements would reveal aspects of the underlying structure of the particles or film. By collaborating with applied physicists we built complementary measurement platforms and computer models to increase the understanding of how the chemistry and deposition processes affected the magnetic characteristics. This led to two further publications in 1989 and 1990: the first was “Transverse Susceptibility of a Fine Particle System” and the second “Calculation of Time Dependence in Thin Films.”
My supervisor for my Ph.D. was Prof. Roy Chantrell. He has many admirable qualities but one that is outstanding is his commitment to collaboration. My Ph.D. was part sponsored by Bayer AG in Germany and through this route I began to collaborate with an applied physics research group at the Ruhr University in Bochum, Germany. As I approached the end of my Ph.D. I was offered a position at the Ruhr University to continue working with the applied physicists on ferromagnetic resonance studies whilst writing up. I therefore lived in Bochum for a period, up until my viva in 1992. This resulted in a publication, “Ferromagnetic Resonance Investigations of Particulate Magnetic Recording Tapes.”
Small and Medium Sized Enterprises (SMEs) and Start Ups
I obtained my Ph.D. entitled “Computer Simulations of Particulate Recording Media” in 1992. The same year I had an offer to join a small enterprise in mid-Wales that developed, manufactured and sold measurement systems primarily in the area of magnetic recording tape. Recent changes in the management of the small company had led to a need for a research and development manager with the intent of catalysing innovation. This meant the atmosphere was very much like a technology start-up. The company was located in mid-Wales due to the generous conditions on offer from the development agency (a non-governmental organisation that used to provide support for developing regions.) This gave us access to funding to undertake research and development work to expand and explore new markets. Initially we developed measurement systems in collaboration with companies such as Sony, Technicolour and Rank that were focused on VHS recording tape. In working with these companies I adopted similar approaches to the work we had done at ICI. The difference here was that we were not involved in research for a multi-national company but were selling high-end, fairly bespoke systems to commercial organisations.
Commercialisation and the Need for Sustainability
In this small mid-Wales start-up there was always a tension between the development costs and the realistic price we could achieve for each system. Through grants and sales we expanded the research and development team to six people and developed a range of automated measurement systems. This involved working closely with customers in what I now understand to be an action research type approach to iterating solutions to meet their needs. User interfaces and reporting were developed that facilitated the customer’s production systems and integrated with their workflows. However, the business was very dependent on these high-end, sometimes one-off systems and lacked regular cash flow. This meant we were always running to stand still. Small companies often sell from a specification when the product does not actually exist. The knowledge that it can be developed overrides any concern about taking the order. There were several cases of making commitments to deliver complex systems before work had even commenced on their development.
Product Development – What Outcome are you Looking for?
We developed and sold a growing range of measurement systems that were often quite bespoke. The toughest case of this was a void detector sold to a Korean company. The specification sold pushed all of the available technology to its absolute limit. The system consisted of multiple visible red lasers forming a staggered knife-edge beam that passed through magnetic tape being manufactured at ten metres per second. The tape was in the form of a web approximately a metre wide. Detectors on the opposite side of the tape to the lasers measured the light transmitted and monitored for defects and the opacity of the coating. In this way continuous quality monitoring could occur. To be able to measure down to the minimum defect size in the specification, the light level monitoring had to be processed in parallel. Hence each 12.7 mm section of web was monitored by a separate digital signal processor (DSP.) There were around 80 individual processors taking in data about each 12.7 mm of tape. This data was processed locally by each DSP and summary results across the whole width were passed to a single microprocessor. This microprocessor further analysed defects and sent a summary of the web to a PC every second. The PC displayed the web travelling past the head on a graphic display showing defects and the overall opacity. It also produced summary reports for the whole roll and for each type of defect. Every part of the system was working at its maximum capacity. When the customer finally witnessed the system capability in action they were concerned by its sensitivity and asked that we desensitise it to defects. A perfect example of leading with a specification that was not what the customer required. In the process we made the system far harder to develop and deliver than it needed to be. The valuable lesson learned during this time was to fully explore and understand the outcomes the customer was expecting.
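The two-level processing architecture described above can be caricatured in a few lines: each per-channel "DSP" reduces its own ~12.7 mm strip of light-level samples to a small summary, and only the summaries travel up to the supervising processor. The channel count, threshold and readings below are illustrative assumptions; none of the real firmware is reproduced here.

```python
# Caricature of the void detector's parallel architecture: one local
# summary per 12.7 mm channel, merged by a single supervisor process.
# Threshold and readings are made-up numbers, purely for illustration.

WEB_WIDTH_MM = 1000.0
CHANNEL_WIDTH_MM = 12.7
NUM_CHANNELS = int(WEB_WIDTH_MM // CHANNEL_WIDTH_MM)  # 78 channels here

def channel_summary(samples, void_threshold=0.8):
    """What one per-channel 'DSP' reports for its strip of the web.

    A void lets more light through, so a transmitted-light reading above
    the threshold counts as a defect; the mean level tracks opacity."""
    defects = sum(1 for s in samples if s > void_threshold)
    return {"defects": defects, "mean_level": sum(samples) / len(samples)}

def supervisor_report(per_channel_samples):
    """Merge the channel summaries, as the single microprocessor would."""
    summaries = [channel_summary(s) for s in per_channel_samples]
    return {
        "total_defects": sum(s["defects"] for s in summaries),
        "defective_channels": [i for i, s in enumerate(summaries)
                               if s["defects"] > 0],
        "web_opacity": 1.0 - (sum(s["mean_level"] for s in summaries)
                              / len(summaries)),
    }
```

The point of the structure is that no single processor ever sees the full data rate: each channel reduces its own stream locally, and only compact summaries cross to the next level.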
Over-promising and under-delivering was a continual source of issues for the company.
Research as Part of Product Development
During the development of the magnetic testing systems I continued to collaborate with several academics at universities in the UK. This resulted in five more publications between 1992 and 1995. These publications were:
“Experimental and theoretical studies of transverse susceptibility in recording media”
“The reversible transverse susceptibility of particulate recording media”
“A neural network approach to the determination of anisotropy distributions”
“Deconvolution of anisotropy field distributions from transverse susceptibility measurements”
“Application of neural networks to the determination of HK distributions from transverse susceptibility data”
These publications arose out of unpublished work from my Ph.D. and new work that evolved as part of the technology developments within the company.
Diversification and Optical Media
Through a technology transfer of optical media technology from a Dutch company we diversified into CD and later DVD testing. Although much of the data acquisition and user interface concerns were similar, the physical artefact and parameters were completely different for optical media compared to the magnetic media we knew well. The technology transfer introduced me to many cultural and sociotechnical factors in action. We took over the CD designs, learnt the specifications for the quality parameters and began to innovate new measurements, all in less than a year. This introduced me to the subtleties of quality control and quality assurance. In the early days of a new technology, such as the introduction of DVD, it is no use just to measure parameters for quality assurance. When something is going wrong it is important to have broader measures and understanding of the parameters of the disc. Diagnosing the problem and relating it back to the manufacturing process is essential in rapidly getting production back on specification. This is quality control, and having the broadest range of measurements becomes key. When manufacturing is well established and stable, the key parameters become the monitors of quality assurance. Collaborating with the manufacturers and developing the measurement systems that gave this depth of understanding to provide quality control was essential in the early years of a new medium.
Format Wars and Cultural Issues
We began with CD test systems but the next generation of optical disc was in development. It is interesting to briefly consider these early years of the 1990s, as a format war was occurring over the replacement for CD and the introduction of movies on optical media. Often when technology moves on and a new consumer product becomes viable, the dominant consumer electronics companies line up behind competing standards that are incompatible. The often-cited example of this was the competition, from 1976 onwards, for the analogue video market between VHS, Betamax and V2000. However, there have been many examples through history, from competition around railway gauges through to Internet standards such as HTML5 and H.264 video encoding/decoding. In the 1990s the giants in consumer electronics were backing Multimedia Compact Disc (MMCD – an evolution of the audio CD) and Super Density disc (SD) as the media to introduce digital video. At this time we were extending our technology beyond audio CD to try to develop quality control equipment for both technologies. In the end, rather than have a war, a compromise was arrived at whereby the standards were merged to produce the Digital Versatile Disc, or DVD. The time spent jockeying over formats had allowed us to build collaborations with a range of organisations, including a key player in bringing the standards together. Through our work with Warner Advanced Media Operations in the U.S., and their closeness to Toshiba in Japan, we were able to develop a test system very early in the establishment of the DVD standard. We had direct access to the development of the disc standards and the interoperability testing that was key in those early years. We became one of the few companies worldwide that had automated quality control systems for DVD manufacturing. This resulted in a major sale to the U.S. company against competitors from Sweden, the U.S. and Austria. It also raised the profile of the company, resulting in a collaboration with a U.S. optical disc inspection company and a merger with a German optical inspection equipment company. In less than ten years the company went from a start-up to an international merger and a listing on the German Neuer Markt exchange.
Sales and Marketing and Globalisation
As we engaged with the DVD technology and the importance of magnetic media waned, my role became more sales oriented, although I was still feeding back a lot of technical material to the research and development group. I travelled, engaging with the media manufacturers and demonstrating the system we were developing. It was during this time that I experienced a new set of predominantly cultural issues. The new company had Dutch, German and British senior management. The evolution of the DVD market also moved much more rapidly than previous technologies. Globalisation had a huge impact on DVD manufacturing. In the days of VHS, manufacturing was dispersed across geographies and the need for quality control across all of these locations meant there was a healthy market in test equipment. Within a few years DVD manufacturing consolidated into low cost countries. The ability to include several languages and cover greater regions with a single disc meant that manufacturing was less specific to a region. The stability of the process evolved rapidly and the emphasis became more about quality assurance. The market for test equipment declined much more rapidly as manufacturing consolidated.
Around 2001 I left to go to the U.S. to set up a sales agency specialising in optical media and printing inspection equipment. I remained in the U.S. for around two years and returned to the U.K. in 2002. The consolidation and concentration of manufacturing in optical media had resulted in an ever decreasing market for test equipment. I set up a consultancy in the U.K. in 2002 and undertook work in the automotive industry and produced some new developments in optical thin film measurements. I also undertook contract work for another consultancy in a business development role.
Health and Social Care and Communications Technology
For many years I had harboured the belief that telemetry and remote communication technology could assist in the support of patients at home and when mobile. I had also had a long-term interest in biology and healthcare. (I was also still irritated by failing my undergraduate degree.) I therefore signed up in 2004, with the Open University, to begin a degree in Life Sciences. I reasoned that a better understanding of biology would help me in working with medics and it gave me a chance to redeem myself. My work in physical media led to me joining a company in 2005 focused on copy protection technologies for the cable and satellite industries.
I graduated in 2008 with an upper second in Life Sciences. I had gained some interesting insights during the degree into some of the challenges in the life sciences sector. I had also built up great respect for the Open University and the quality of its staff and courses. I was inspired to sign up for their Masters in Business Administration because they had a specialist option in life sciences. I began immediately and although the specialist focus was targeted very much at the pharmaceutical aspects of business development it helped me understand many of the challenges surrounding change in healthcare.
In 2009, whilst at a trade show related to the satellite and cable industry, I met up with a university friend I had shared a house with nearly twenty years earlier. He told me he was working for a consultancy that provided services to the major satellite and cable companies, predominantly around set-top boxes. We discussed what I was doing and he said that there was a workstream in his company looking at providing video conferencing to the home, using the TV, to support people remotely with long term conditions. This seemed like the perfect “in” to start exploring the use of technology in the home to support people to be more independent and to tailor the intervention to a whole range of disease conditions. I showed great interest and within a month was transitioning to the role of Director of Telecare for Red Embedded Design Ltd.
Putting Ghosts to Rest and Learning about Wicked Problems
In 2011 I graduated with a distinction in my Masters in Business Administration and finally put the ghosts of my failure at undergraduate level to bed. I also reassured myself that I was still capable of learning new things. The old dog could still learn new tricks. The next seven years taught me some important lessons that I have seen repeated by friends and colleagues. The first lesson I would put into words is that “just because you decide that you should follow a certain route to success, do not assume life will let you.” The second is “never assume that a problem has a solution.” I have become acutely aware of wicked problems. These are problems for which there is no “correct” answer. Generally, they respond to being addressed by mutating into a different problem. They are a facet of complexity. What they have taught me is that, even if you cannot overcome the whole problem, it is better to have moved things along by tackling part of it. Governments are particularly at the mercy of wicked problems. When developing policy, it often turns out that the outcomes are a weak version of what was sought. Often the outcomes are something that could not have been predicted. This is particularly challenging for companies that try to help deliver the outcomes that the policy espouses, only to find that what happens in practice subverts the approach. My specific experience of this came in trying to introduce remote care for people at home.
The first few years promoting the use of video in care provision taught me a lot about the nature of health and care delivery. It also taught me a lot about language. Initially we had thought, as technologists, that if we explained how the technology worked, what it could do and what outcomes were possible, it would be adopted and spread immediately, leading to scale deployments. Many of our early discussions were around being able to scale up and moving rapidly enough to outrun any competition. These discussions now seem worse than naïve. They ignore the fact that the language of technology is not the language of care, and that outcomes that are good for patients and their families are not how care gets paid for. The professionalisation of care over the years has served to fragment it into siloes. Within these siloes the dominant mechanism for paying is based not on outcomes but on activity. Hence, when you introduce a new way of delivering care that disrupts the current approach, there is no way to pay for the transformation. The payment mechanisms lock in place the current delivery methods. Focus is on process, not outcomes. This led to a publication in 2014, “A Socio-Technical Approach to Evidence Generation in the Use of Video Conferencing in Care Delivery.”
Public Procurement and Payment by Results
We climbed a steep learning curve as we probed the delivery of primary, secondary and social care. We could demonstrate great outcomes in reducing current activity in the care system as people were supported at home. We could also show that people became less anxious, more confident and understood their conditions better. Remote care serves to allow patients to manage their own conditions better whilst reducing acute events. This is exactly what the policies around care say is the desired result. What we discovered is that there is no way to pay for such improvements, particularly as reducing traditional activity means someone in the current system gets paid less. A further challenge is around public procurement. EU competition rules dictate that any public sector contract whose value exceeds around £100k must go out for procurement through the Official Journal of the EU. The threshold for this requirement was set decades ago, so, over the years, more and more economic activity has fallen into this procurement pipeline. This has two impacts:
- In order to avoid a legal challenge, the specification of what is being procured has to be tightly defined.
- The procurement exercise is costly to administer and can take up to three months.
This is the perfect brake on innovation. It impedes collaboration between technologists and the public sector and encourages large-scale procurement of off-the-shelf items.
The payment mechanism in the NHS was developed as “Payment by Results”, a completely misleading term since the tariff pays for activity, not results. Secondary care activity is paid for by Healthcare Resource Group (HRG) codes. Each HRG code has a tariff that the hospital is reimbursed when that activity occurs. Hence a complex respiratory admission might carry a tariff of around £2,400. This pays for the admission process and a certain number of days in a hospital bed. Days beyond this number fall under another code that pays for the further nights. If a person is kept at home and managed remotely, the hospital loses tariff income. There are no tariffs for video, so remote care for these patients is a cost with no reimbursement mechanism. So payment by results applies only if the result is that the person is admitted to hospital.
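The perverse incentive can be made concrete with some toy arithmetic. The ~£2,400 complex respiratory tariff is the figure quoted above; the tariff-covered length of stay, excess bed-day rate and cohort sizes are invented purely for illustration.

```python
# Toy model of activity-based ("Payment by Results") hospital income.
# ADMISSION_TARIFF is the ~2,400 GBP figure from the text; everything else
# (covered days, excess bed-day rate, cohort sizes) is an assumption.

ADMISSION_TARIFF = 2400   # pays for the admission plus a fixed number of days
TARIFF_DAYS = 7           # days covered by the base tariff (assumed)
EXCESS_BED_DAY = 250      # payment per day beyond TARIFF_DAYS (assumed)

def hospital_income(admissions, avg_stay_days):
    """Income when payment follows activity: tariff plus excess bed-days."""
    excess_days = max(0, avg_stay_days - TARIFF_DAYS)
    return admissions * (ADMISSION_TARIFF + excess_days * EXCESS_BED_DAY)

# A remote-care service that averts half of a cohort's admissions removes
# tariff income, and there is no code under which the service itself is paid.
baseline = hospital_income(admissions=100, avg_stay_days=9)          # 290,000
with_remote_care = hospital_income(admissions=50, avg_stay_days=9)   # 145,000
lost_income = baseline - with_remote_care                            # 145,000
```

On these assumed figures, delivering exactly the outcome the policy asks for costs the hospital £145,000 in income, with no tariff through which the remote service earns anything back.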
As we explored primary, secondary and social care, these two themes of procurement and payment for activity recurred. The money in care does not follow what the patient or their family wants; it follows the activity of the care providers. Consequently, stopping any activity is costly as it reduces income. Developing new ways of caring costs administration and time (procurement), and there is no reimbursement as tariffs do not exist.
It is not possible here to explore all of the ways we tried to develop a service that lowered these barriers; that is material for a much bigger discussion. We continued to develop the technology and service offering between 2009 and 2016, and were successful in obtaining grants and some innovation funding. However, we were not able to put the organisation onto a sustainable financial footing: we needed around £500k per year in ongoing revenue to achieve sustainability, and this proved impossible.
Better Outcomes for Learning Disabilities – Transforming Care
Between 2009 and 2016 we bid for and won a project to extend the use of remote care technology to people with learning disabilities. This £1m project was funded through the Small Business Research Initiative (SBRI) in health, overseen by Innovate UK. We integrated many aspects of remote care into a single service and undertook collaborative development with care providers and people with learning disabilities to address their aspirations. The result was a video service that included:
- Video calls through the TV to family and care providers.
- Integration between health and social care providers in communicating with each other and people being cared for.
- Multi-way video providing the ability to undertake multi-disciplinary remote meetings with people at home and in other locations.
- Integration of the technology with NHS and local authority networks making its use transparent across stakeholders.
- Inclusion of physiological readings such as blood pressure, weight, temperature, pulse and oxygen levels.
- Reporting and measurement of patient outcomes.
- Ambient monitoring through interoperability with an Internet of Things platform. This meant that motion, temperature, door and other sensors could be integrated into the service.
- Use of tablets and phones to communicate and provide data to the service.
We had brought together aspects of telemedicine, telehealth, telecare and patient-reported outcomes into a single integrated platform. This work led to another publication.
We were unable to obtain a commitment for ongoing revenue. The whole approach was still treated as a project and after seven years had not become embedded in care provision.
Evaluation, Action Research, Social Value and Sociotechnical Systems
During the seven years developing and deploying remote care I learnt a great deal about how to develop technology in cooperation with practice. Only in retrospect did I identify a common theme to all of the work I had done: how do you embed technology in daily practice? My work with the Bayswater Institute arose out of a search for an organisation that brought together these concerns about outcomes, technology, practice and evaluation. I needed to bring in an evaluation partner for the BOLD-TC project, but realised that health economics alone would not support the transformational change. There had to be an understanding of the sociotechnical nature of the interaction between the technology and practice, and the approach needed to treat evaluation as part of the change mechanism that underpins ongoing learning. This action research stance is not common among evaluation approaches. I found few organisations tackling such real-world challenges; instead, I found organisations that, in effect, deny the complexity of the real world. They:
- Treat project management as a linear process that does not accommodate learning.
- Are strait-jacketed in their thinking about health economics by current reimbursement mechanisms, with no recognition of social value or of outcomes for people and their families.
- See technology embedding as a top-down implementation strategy, even though this approach has repeatedly failed.
- Do not incorporate learning from every stage into their on-going methods of developing technology and practice.
After closing v-connect down in 2016 I decided to join the Bayswater Institute. My journey through technology development, commercialisation and transformational change had led me to an understanding of the importance of measurement and evaluation in enabling change. I believe that real learning can only be obtained by exploring real-world implementations of technology in an iterative way that engages with the practice of usage. Only through such approaches can the complexity of the world be taken on, and technology implemented in a way that provides a good return on the public money invested and delivers positive outcomes for people.
Publications
D. J. Reynolds, A. Hoare and M. C. Holmes, A Monte-Carlo Study of a System of Anisometrically Interacting Particles, Molecular Crystals and Liquid Crystals Incorporating Nonlinear Optics, pp. 113-119, published online 13 Dec 2006.
A. Hoare, D. Melville, D. Rassi and V. Samadian, Rapid Methods for the Calculation of the Magnetic Fields Associated with the Human Thorax, IEEE Transactions on Magnetics 24(2), March 1988. https://doi.org/10.1109/20.11665
B. H. Blott, B. S. Janday, D. Melville, A. Hoare, D. Rassi and V. Samadian, Design and Assessment of SQUID Magnetometers Using Reciprocity Methods, J. Phys. Colloques 49 (1988), C8-2061 to C8-2062.
R. W. Chantrell, A. Hoare, D. Melville, H. J. Lutke-Stezkamp and S. Methfessel, Transverse Susceptibility of a Fine Particle System, IEEE Transactions on Magnetics 25(5), pp. 4216-4218, September 1989.
A. Lyberatos, R. W. Chantrell and A. Hoare, Calculation of Time Dependence in Thin Films, IEEE Transactions on Magnetics 26(1), pp. 222-224, 1990. https://doi.org/10.1109/20.50540
Th. Orth, U. Netzelmann, B. Dean, A. Hoare, O. von Geisau, J. Pelzl, R. W. Chantrell, R. Veitch and H. Jakusch, Ferromagnetic Resonance Investigations of Particulate Magnetic Recording Tapes, Journal of Magnetism and Magnetic Materials 101(1-3), pp. 235-236, October 1991.
P. M. Sollis, A. Hoare, A. Peters, T. Orth, P. R. Bissell, R. W. Chantrell and J. Pelzl, Experimental and Theoretical Studies of Transverse Susceptibility in Recording Media, IEEE Transactions on Magnetics 28(5), September 1992.
A. Hoare, R. W. Chantrell, W. Schmitt and A. Eiling, The Reversible Transverse Susceptibility of Particulate Recording Media, Journal of Physics D: Applied Physics 26(3), 1993.
H. V. Jones, A. W. G. Duller, R. W. Chantrell, A. Hoare and P. R. Bissell, A Neural Network Approach to the Determination of Anisotropy Distributions, Journal of Physics D: Applied Physics 31(21), 1998.
R. W. Chantrell, P. R. Bissell, P. Sollis, A. Hoare and A. Orth, Deconvolution of Anisotropy Field Distributions from Transverse Susceptibility Measurements, Journal of Magnetism and Magnetic Materials 177-181(2), pp. 894-895, January 1998.
H. V. Jones, A. W. G. Duller, R. W. Chantrell, A. Hoare and P. R. Bissell, Application of Neural Networks to the Determination of HK Distributions from Transverse Susceptibility Data, Journal of Magnetism and Magnetic Materials 193(1-3), pp. 416-419, March 1999.
A. Hoare and K. Eason, A Socio-Technical Approach to Evidence Generation in the Use of Video Conferencing in Care Delivery, International Journal of Sociotechnology and Knowledge Development 6(2), 2014. https://doi.org/10.4018/ijskd.2014040103
A. Hoare, Factors Affecting the Move to an eSystems Approach to Remote Care Delivery, 9th International Conference on Developments in eSystems Engineering (DeSE), 2016. https://doi.org/10.1109/DeSE.2016.3