I used to think that New Year’s resolutions were pretty much a waste of time. Most people that I know still do. I vow to be good; I resolve to quit eating; I resolve to quit smoking; I resolve to get more exercise; I resolve to stop being sarcastic. Yeah, yeah, and... yeah.
But over the last 5 or 6 years, the concept of resolutions started to make a little more sense. New Year's Resolutions are just a variation on the theme of setting Objectives and Goals.
I mentioned before that CMPT's quality system has been certified to ISO 9001 (first the 2000 version, now the 2008 version) since 2002. Every year in October we set quality and programmatic goals and objectives and then monitor to ensure that we meet our goals. When we started we did this so that we would continue to get certified, but after a while we saw the value in the exercise, and did it for us... to keep CMPT on path and on track. And it seems to actually be effective.
So it started to make sense that if we can do this positively and painlessly for a proficiency testing program, we should be able to do the same thing for the Program Office for Laboratory Quality Management, and for my teaching commitments, and also for the medical laboratory at which I worked as a medical director for Quality. And again, the experience reinforced itself by actually being productive.
Go figure. Those guys like Deming and Crosby actually knew what they were talking about.
And now that MMLQR goes through its first New Year, it is time to start setting some objectives for it too. Actually MMLQR has only been running for 6 months, but this is sort of like setting or re-setting an organization's fiscal or operational calendar.
Just to make sure we are on the same page, I use the term Objective to mean a large scale outcome that may take a number of steps to reach, and I use the term Goal to identify the individual steps. To be credible the objectives should be achievable and the goals should be measurable.
So here are my goals and objectives for MMLQR for calendar year 2011.
Objective 1:
MMLQR will be recognized as an active and credible and readable website to promote diversity of opinion and discussion for the full range of issues that surround quality management of medical laboratories.
Goal 1
To meet the measure of "recognized" I will promote MMLQR to support current readership and to encourage a broader audience.
Goal 2
To meet the measure of "active", I will generate at least one new entry each week.
Goal 3
To meet the measure of "credible", I will actively seek review and comment by others with a long-term interest in medical laboratory Qualitology. This might be in the form of another shot at an electronic survey, or in the form of an external-internal audit.
Goal 4
To meet the measure of "readable", I will reduce the number of spelling and editing errors per entry. (This one will be tough because I am terrible at proof-reading.)
Goal 5
To meet the measure of "diversity of opinion and discussion", I will invite and publish comments and commentary of differing opinion (as long as words like "jerk" and "pinhead" are not included).
Goal 6
To meet the measure of "full range of issues that surround quality management" I will expand the topics covered including more commentary on politics and quality partners, as I see it.
So that is a good start, with an achievable objective and a series of measurable goals. MMLQR is a young start-up which has reached a certain level of fragile stability (similar to how they describe the Canadian economy). Going to the next step will take the discipline of setting some objectives and goals, and PDSA. We will see how I do by the end of the year.
See you in 2011.
m
Sunday, December 26, 2010
Predictions in Qualitology - 2011
As the year slowly creeps to a close and the next year gallops forward, I start to wonder what 2011 will look like for medical laboratory quality.
1: Flavor of the decade?
It is unlikely that Quality is going off the health-care agenda any time soon. Across Canada the media has decided to keep an eye on the laboratories, waiting for the bad thing to happen. We are unlikely to disappoint.
This will put increasing pressure on provincial governments and maybe (but unlikely) Ottawa.
Organizations like the Royal College will have a hard time turning away.
The validity of Crosby will ring true in health: the costs of non-conformance, in NOT doing it right the first time, are too high. I don't know if we will ever actually achieve zero tolerance for error, but 2011 will be a pressure point.
2: Jobs, Jobs, Jobs
As the world comes out of recession and folks again have money to spend, the jobs will return. This will be true in both the private and the public sector. Near the top of the list will be the positions postponed or sacrificed along the way. People over machines. Buying bigger and better analyzers is unlikely to be seen as the best way forward. But institutions will still be cautious and a major priority will be effective use of money and more effective monitoring.
And that will mean more interest for more quality team positions.
Add to this the increasing political pressures mentioned above, and there will be even more Quality positions. Think of 2011 and 2012 as the equivalent of what SARS did for Infection Control.
3: Knowledge is King.
In Canada, we had our Royal College meeting and the importance of shared knowledge in Quality was seen as priority number ONE. In the US the audience for quality is growing, as is the number of laboratories seeking supplemental accreditation. The number of folks coming to our training programs from around the world is increasing. So the message in healthcare is clear: more folks need more information, and we will see the sharers of that knowledge in more demand.
Again as mentioned above, the wave of new positions will require a wave of new educational opportunities.
Universities and Colleges will become more actively engaged in Quality.
4: Conferences - maybe
Organizations are increasingly leery of conferences as good vehicles for continuing education. Airfares, hotels, and meals are very expensive. So how does this fit with "Knowledge is King"?
First, the number of on-line courses and conferences and confabs will increase, using a wider array of communication tools that promote connectivity. Video conferencing, collective conversation, and creative use of networking software will be an increasing part of the on-line education experience.
Second, when conferences are held, attracting an audience will be tougher as folks get more selective.
The conferences that survive will be fewer but better.
Successful conferences will be the ones in the right place, at the right time, with the right information, and the right contacts. For example, Vancouver in June at the POLQM Quality Weekend Workshop (visit www.POLQMWeekendWorkshop.ca).
So I'm looking forward to an exciting year coming up, with lots to keep us all busy.
See you next year!
m
Thursday, December 23, 2010
Message to self
Let me start by saying that I do a lot of surveys. To date my account has 82 surveys completed and an additional 3 currently active. I survey students regularly during courses, and annually we do at least one customer satisfaction survey for CMPT. I have done surveys for the International Organization for Standardization (ISO) and for International Laboratory Accreditation Cooperation (ILAC). Over the years I have become adept at creating surveys that address the issues that I want addressed.
Last week's survey was my first experiment of linking a survey to a discussion website like MMLQR. I would not call it a totally successful experiment.
When I look at the reported results, first I noted that we are attracting a variety of laboratory Quality professionals from Canada and internationally. There are some positive trends. Based on a 6-point Likert scale, this site was ranked either as Excellent or Good by nearly everyone with regard to variety of topics, relevance, clarity, accessibility, and refreshment. There were no "poor" or "unacceptable" responses. The same was true for the Overall assessment.
So this is all good, Yes?
Well, it provides documentation that supports impressions based on the progressively increasing readership, and it confirms that the people that I am interested in engaging in conversation are finding the site. But based upon the number of tracked page views, it looks like less than 4 percent of people connecting to MMLQR have responded to the survey.
With the information that I can garner, I don't know how many people opened the survey but chose not to answer any questions, but I assume that that is a very small number.
So there is a problem, but a generalisable problem. Almost all the surveys I have done in the past have been to a closed or fairly closed population, where I could go back to the group and try again and again. This is a survey to an open population. In that regard it is similar to attempting to do a satisfaction survey of people exiting a laboratory patient service centre, or of physicians that use laboratory services.
In all these situations, one can generate a denominator of how many potential responses there could have been. The challenge is how to increase the numerator without generating bias, either positive or negative.
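As an aside, the arithmetic behind that "less than 4 percent" figure is simple enough to sketch in a few lines of Python. The numbers below are illustrative placeholders, not the actual MMLQR statistics.

# Illustrative sketch (Python): estimating an open-population response rate.
# The figures are placeholders, not the actual MMLQR numbers.
def response_rate(responses, page_views):
    """Return survey responses as a percentage of tracked page views."""
    if page_views <= 0:
        raise ValueError("page_views must be greater than zero")
    return 100.0 * responses / page_views

# Example: 25 completed surveys against 700 tracked page views
# works out to about 3.6 percent -- "less than 4 percent".
print(round(response_rate(25, 700), 1))

The hard part, of course, is not the division; it is deciding whether the page-view denominator really represents the population you wanted to hear from.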
In the laboratory setting, one might try creating focus groups as definable groups, but one has to create incentives to garner participation. This is potentially expensive and would only be applicable if one risked confidentiality breaches. One could combine focus groups with electronic surveying, but still one would have to have identifiers to work with.
In this setting, I have no access to identifiers, and no obvious inducements that might entice a response.
So it is back to the drawing board with some questions to be asked: who do I want to attract to the survey, how can I entice them to participate, what is a sufficient cluster, and what kinds of questions will capture the information that I want? And maybe to affirm why I want to generate the information in the first place.
It's kind of like my own PDSA.
If I come up with some answers I will try again.
In the meantime if you are in the 4%, many thanks for participating.
In the meantime, I am going to take a few days off and come up with my predictions and resolutions for a happy and Quality 2011.
For those of you who celebrate the day, Merry Christmas.
m
Sunday, December 19, 2010
Preparing our next generation of leaders
I am preparing a number of new presentations for our Resident and Graduate Student Quality Seminar Series and took the opportunity to re-read Deming’s Out of the Crisis, written in 1982 to expand on the 14 Principles. In Chapter 2, "Principles for Transformation of Western Management," he writes: “Support of top management is not sufficient. It is not enough that top management commit themselves for life to quality and productivity. They must know what it is that they have committed to – that is, what they must do. The obligations can not be delegated. Support is not enough; action is required.”
This is a core message that I am going to convey.
Medical laboratories provide a broad variety of services including creation and provision of a menu of diagnostic tests, creation of a method for ordering tests (I hate that term: more on this later) and then providing a process for collection and transport. The samples get accessioned and tested and results get generated and reported.
Along the way some quality processes take place, including quality control and proficiency testing and sometimes accreditation. Some projects, like the application of “Lean”, or a Lean variant of time and motion, get initiated and sometimes actually completed. But we all know that problems still continue. Over the last few years we have seen pathologists misinterpreting and misreporting findings, faulty HIV testing, faulty tissue diagnostic tests, and pathologists with a pathological ineptness in writing reports. We also have well-documented sample contaminations in chemistry and microbiology.
In some laboratories we are starting to do more. Quality management is starting to manifest some continual improvement processes (which is good) and even some error investigation (which is better).
So far I have told you nothing that you don’t know and that you have not heard before.
But here’s what our residents need to know and understand: when push comes to shove, all the above-mentioned errors have a single primary root cause, and it has little to do with technologist training, standard operating procedures, or competency assessment. Our primary root cause of laboratory error is the consistent and persistent absence of personal, active engagement by our medical laboratory directors in Management Review and change.
Quality is not about hiring a Quality Head, however named, and then delegating authority. Management Review is not about being handed a bunch of manuals and annually signing them off, or about asking the Quality Guy how we are doing.
Our next generation of medical laboratory directors needs to know that quality management is as much a part of their job as is reading pathology slides or signing out reports or dealing with human resources and budgets. And it is not about the number of slides read per hour worked or the number of INRs ordered per week. And it is barely about some artificial and artefactual measure of turnaround.
They need to know there is a quality expectation, a quality vision. Keeping laboratory staff focused on what matters is a critical function of the laboratory director. They need to know that satisfaction is not about making complaints go away, but is about comprehensive communication with hospital staff and patients.
They need to know that quality is a science of planning, execution, measurement and response. And they need to know that it is their job to make these things happen.
So it should be an interesting seminar series.
m
PS: I have a plan to see if we can monitor our impact on three scales: immediate, intermediate and at five years.
PPS: We have made some important changes to the Quality Weekend Workshop. Dr. Denise Dudzinski is a bioethicist who recently published an article in NEJM on Disclosure Dilemma – Large-Scale Adverse Events. I am looking forward to her presentation.
PPPS: I would really appreciate it if you can go back to the previous entry and fill in the survey.
Thursday, December 16, 2010
Quality Tools for a Quality Website
The Quality Tool Box is loaded with goodies, like Quality Indicators, Analysis through Pareto Charting or Sigma Calculations, Planning with Gantt Charts, time and motion studies (is that Lean or Taylorism?), and surveying for customer opinion. Over time, the qualitologist uses all these tools to monitor and improve the quality of process.
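To make one of those tools concrete, here is a minimal Pareto-style tally in Python; the error categories and counts below are hypothetical, chosen only to show the "vital few versus trivial many" calculation that a Pareto chart presents graphically.

# Minimal Pareto analysis sketch (Python). The categories and counts are
# hypothetical examples, not data from any real laboratory.
from collections import Counter

errors = Counter({
    "specimen labelling": 42,
    "transcription": 17,
    "delayed transport": 9,
    "lost requisition": 4,
    "other": 3,
})

total = sum(errors.values())
cumulative = 0
for category, count in errors.most_common():
    cumulative += count
    print(f"{category:20s} {count:3d}  cumulative {100 * cumulative / total:5.1f}%")
# The top one or two categories usually account for most of the problems --
# the "vital few" that a Pareto chart is meant to expose.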
Today I am interested in your thoughts.
We have posted 62 entries to MMLQR. Some brilliant, some maybe not so much. The point of the exercise has been to develop some themes like Cost and Performance and Quality Partners.
So as I go forward I am interested in knowing if I am striking a chord with readers. It is time for an opinion survey.
The enclosed survey should take no more than 4 minutes to complete. It will be available for about 2 weeks.
Your participation is much appreciated.
m
Tuesday, December 14, 2010
Big enough to fail?
WITH APOLOGIES, THIS IS ANOTHER RANT. RECENT EVENTS TELL ME THAT SOME THINGS ARE VERY RESISTANT TO CHANGE, EVEN WHEN WE KNOW BETTER. IT'S KIND OF LIKE SMOKING.
Question 1: How many laboratories in Canada (or indeed anywhere) have been closed down because they did not get a perfect score on their proficiency testing challenges? Answer: None. Never. Notta. Zip. Zero.
Question 2: How many laboratories have been allowed to remain open, but lost their ability to bill their provincial medical services program for services rendered because they did not get a perfect score on their proficiency testing challenges? Answer: see above.
Question 3: How many laboratories in Canada “game” their proficiency challenges by holding the sample back until the right technologist is available, or do repeat testing, or do extended testing, or send the sample to a referral laboratory? Answer: More than one; disturbingly, a lot more than one.
Not to belabor the issue, but what is the point of the exercise? Are we really that insecure about our professional competence that we have to “cheat” when there is nothing on the table, nothing to gain? No gold stars, no extra cookies, no scholarship, nothing.
What we do lose is the opportunity to check that our standard operating procedures are getting us to the right answer. What we do lose is the opportunity to do a competency check on our operating systems. What we do lose is all the extra time, effort, energy and money spent doing the extra testing and scheduling. Consider that finding a single weakness through PT can save you poor-quality costs equaling your laboratory's total PT costs for a whole year, or more.
There are solutions or work-arounds that can be put into place, but they are either expensive or inconvenient, or create increased, and from my perspective unacceptable risk.
We have been toying with the possibility of linking proficiency testing turnaround times to clinical sample turnaround times. If a sample should pass through the clinical laboratory in less than 24 hours, then that becomes the upper limit for the PT challenge. If it should take five days, then the PT turnaround limit is five days. It would not be particularly difficult to do for programs with on-line entry of results. For paper or fax entry it may be a little more difficult. It would result in more paperwork on both ends, and that would result in some inconvenience and added cost.
Another solution would be to create a link between the PT provider and the laboratory information systems, so that reports would be automatically generated and sent to the provider. This would take some initial set-up time and save time on the laboratory side, but would, in most PT programs, cause increased time requirements to transfer results to the database for analysis, with an inevitable increase in cost. Additionally, it is stunning how unstandardized our reports are. But that is a topic for another time.
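To make the turnaround-time idea above concrete, here is a minimal sketch of how such a rule might be checked; the specimen types, the limits, and the helper function are all hypothetical, not features of any existing PT program.

# Hypothetical sketch (Python): flag a PT challenge whose turnaround exceeds
# the limit allowed for the equivalent clinical sample. All values are assumed.
from datetime import datetime, timedelta

# Assumed clinical turnaround limits by specimen type, for illustration only.
CLINICAL_TAT_LIMIT = {
    "rapid assay": timedelta(hours=24),
    "routine culture": timedelta(days=5),
}

def pt_within_limit(specimen_type, received, reported):
    """Return True if the PT result was reported within the clinical limit."""
    return reported - received <= CLINICAL_TAT_LIMIT[specimen_type]

# Example: a "rapid assay" PT challenge received Monday at 08:00 but reported
# Wednesday at 08:00 exceeds its 24-hour clinical limit and would be flagged.
received = datetime(2011, 1, 10, 8, 0)
reported = datetime(2011, 1, 12, 8, 0)
print(pt_within_limit("rapid assay", received, reported))  # prints False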
Or we could disguise the samples completely and have them submitted as clinical samples. But that would cause all sorts of challenges with getting site specific requisitions from each laboratory and creating patient names with appropriate identification numbers. And then we would have to sort through the problems created with billings. (A number of years ago when I was a resident, a microbiologist in the hospital created a throat swab with C. diphtheriae. It was a brilliant idea except that the result got reported to public health and much chaos ensued).
Or we could just get rid of proficiency testing, but most laboratories would lose an essential quality assessment tool and a major source of continuing education.
There is, I think, a more reasonable solution. Laboratory management, including the quality manager, decides that as of this day we do things differently. No more overwork, no more processing by the QC technologist. Routine processing only. It's worth a try.
Yes?
m
PS:
The POLQM Weekend Workshop is coming together well. For those interested in doing a poster or podium presentation, register early and send your abstract to ubcpolqm@gmail.com
Saturday, December 11, 2010
The Science of Qualitology
I like the ASQ’s Quality Management Journal because it publishes articles in a scientific and experimental structure that I understand and expect to see in a journal. The article that I was looking at analyzed factors associated with Quality in hospital settings (see R.E. Carter, S.C. Lonial, and P.S. Raju. 2010. Impact of Quality Management on Hospital Performance: An Empirical Investigation. QMJ 17(4):8-24).
The study design was based on a survey sent to hospital executives in 175 organizations in the mid-US (Kentucky, Ohio, Tennessee, Minnesota, and Mississippi). The surveys were sent to hospital CEOs, who were in turn supposed to pass them on to senior folks like the VP of administration, Quality manager, Support services manager, and Director of nursing. This was very ambitious.
The conclusions they came to were what I would expect; when it comes to quality, size and stress matter. The more uncertainty in the institution, and the larger the institution, the less likely they were to have “measurable” evidence of Quality.
The “measure” of Quality in this study looked at 5 markers of financial performance, 4 markers of market/service development and 4 markers of quality outcomes. That, in my opinion, was a set unlikely to give a clear picture of hospital quality.
And that brings me to my point.
What are the objective measures that one can monitor as an indicator for success or failure for introduction of Quality activities in medical laboratories?
Not success in accreditation or proficiency testing scores. They are too readily manipulated (see M.A. Noble. 2007. Does External Evaluation of Laboratories Improve Patient Safety? Clin Chem Lab Med 45(6):753-756).
Not numbers of reported incidents or OFIs. They are too open to flexible interpretation. OFI reports, if anything, are like unemployment rates. A downward movement in rates may mean more people are being employed, or it may mean that fewer people are bothering to look. And a rise may mean more people are unemployed, or it may mean more people are hopeful and are again more actively looking. In the same way, a rise in the OFI rate may mean more problems are being identified and reported, meaning poorer Quality, or it may mean more engagement leading to more reporting, meaning better Quality.
How about client or staff satisfaction? Maybe, but again, too easily manipulated and too vague.
And in Canada, financial stability or instability are completely inappropriate since 99 percent (or more) of resources come from the government purse.
So we have a dilemma. For good studies we need measurable and interpretable and monitorable outcomes on both a micro- and macro- basis. We do this on a micro-scale all the time (call that Quality Indicators). But to move from interesting to convincing and compelling, we will need to define our macro- outcomes as well.
For Quality to create a lasting imprint in medical laboratories, we are going to have to speak the language of laboratory personnel, pathologists and technologists. We will need the language of science and experimentation, outcome and conclusion.
Any and all ideas are most certainly welcome.
m
PS: Absence of strong interpretable measures makes grant funding difficult, maybe impossible. I have learned this the hard way.
Tuesday, December 7, 2010
Implementing ISO 15189:2007
For folks interested in medical laboratory quality, it is not necessary to explain the number 15189. In the area of medical laboratory quality it is probably the most significant harmonizing document ever created. ISO 15189:2007, entitled "Medical laboratories - particular requirements for quality and competence", sets out a clear and comprehensive set of quality requirements for medical laboratories. Initially published in 2003, it is in place for both laboratory and accreditation body use in over 70 countries. It is being used on every continent, with the possible exception of Antarctica.
Unfortunately, ISO 15189 shares many characteristics with other documents created by the International Organization for Standardization, and lots of other standards development bodies; it can be a challenge to understand and interpret, even in English, because, as they say, the devil is in the details, and understanding the details may require a certain subtlety and appreciation for nuance.
Enter the implementation guides.
There are a variety of implementation guides to assist with understanding this document. Some are in the form of courses, both on-line and in-person. Others are in the form of textbooks. One of the first published was created by the Canadian Standards Association (CSA) and was known as ISO 15189:2003 Essentials. Written by members of the Canadian delegation who were actively involved in the creation of the standard, it was a small but extremely well-received book, purchased and used around the world. It suffered from some editing issues, but the content was reviewed and supported by folks in the US, UK, and Australia.
Well, time went by and in 2007, the standard was republished, with some minor text changes. With passage of time and increasing usage of the standard, a lot of the content was open for revised interpretation and guidance and implementation advice.
So now, the 2nd edition is available through the Canadian Standards Association. The full title is: The ISO 15189:2007 essentials - A practical handbook for implementing the ISO 15189:2007 Standard for medical laboratories.
Again written and supported by the Canadian delegation to ISO TC 212 Working Group 1, it still contains the content of the standard, interpretation and guidance, and tips on implementing the requirements. The guides and tips are designed for use in laboratories with a range of size and complexity. It will be helpful and useful for medical laboratories in many countries.
Most or all of the editing issues that needed addressing in the first edition have been resolved. The book contains both a table of contents and an index, and includes all the annexes.
So the book is worth a read. It is a good book and I will be using it as part of the UBC Certificate Course in Laboratory Quality Management (see www.POLQM.ca).
It is available through the Canadian Standards Association web-site (www.ShopCSA.ca) at a fair and reasonable cost.
One of the themes that I have raised before is that in many countries, medical laboratory accreditation is NOT a requirement, and in some places where it is a requirement, there are allowable options for accreditation to requirements other than ISO15189:2007. But in many countries, medical laboratorians recognize the value of implementing this standard on a voluntary basis because of its superior quality management and its international application.
Some who go through voluntary implementation go further by applying for voluntary accreditation.
Both these steps provide great value to the laboratory that makes the commitment because ISO quality implementation and accreditation can be the basis of international recognition, business opportunity and all the benefits that accrue: higher satisfaction, lower costs, improved patient safety.
I declare my personal connection with the book. I was the co-author of the first edition and the principal author of the new edition. It was written under a fixed contract through the University of British Columbia. Neither I, nor the UBC Program Office for Laboratory Quality Management, nor the university receives payment in the form of royalties for present or future sales.
m
PS: A third iteration of 15189:XXXX is underway and may be available in 2013.
Sunday, December 5, 2010
I Love This Bar
Some of you are probably not followers of country music, and don’t know Toby Keith or his song “I love this bar”. Part of the lyrics go “I love this bar. It's my kind of place just walkin' through the front door puts a big smile on my face. It ain't too far, come as you are. I love this bar.”
The song and sentiment is not about drinking, or about the location. It is about the people.
I raise this because, in the last while as I have had the time to ponder being a medical qualitologist, I have come to the same conclusion. I really enjoy the community of qualitology.
Over the last while we have been working on a number of projects. The UBC Certificate Course is starting to fill up, and our project with I-TECH at the University of Washington is taking shape. Our seminar series of Quality Management for Residents in Laboratory Medicine is shaping up as well, as is the planning and progress for our CACMID/AMMI Symposium in April and the Quality Weekend Workshop in June. Plus, plus, plus.
It’s good that we are busy, but what makes it all the more enjoyable is that each of these projects connects us with more folks interested in the same things that we are interested in.
Folks in qualitology have a number of common characteristics. They tend to be very positive, and hopeful, and interested in making what they do relevant to better workplace management. And they see the direct link to better patient care.
Each of these projects will end up creating supportive dialogue and improved information that will make laboratories better and safer and more effective. And while I am looking forward to all of them, the seminar series for residents is top-of-mind. Residency training is part of the continuum of progressive narrowing of focus that starts in high school and progresses through undergraduate college, graduate studies, medical school and specialty training. I understand it; I am a product of it. But what is really clear is that being finely tuned in the science base of pathology does not prepare folks for how to run an effective laboratory. Our seminar series reverses that narrowing focus and says, “Hey folks… time to broaden out your attention. Quality is not an innate topic, and if you expect to be an effective manager of people and information, you need to know this stuff too. It’s called Management Responsibility and Management Review.”
The series will include 10 presentations:
• Hour 1: Why Quality and Why Now
• Hour 2: History of Quality Management
• Hour 3: Quality Requirements
• Hour 4: Quality Standards
• Hour 5: Applying Laboratory Quality (1)
• Hour 6: Applying Laboratory Quality (2)
• Hour 7: Quality Partners
• Hour 8: Working with Quality Partners
• Hour 9: Costs of Poor Quality
• Hour 10: The role of the physician laboratorian
If you are interested, we will post the presentations as we go through January at www.POLQM.ca.
M
PS: The POLQM Weekend Workshop in June is shaping up to be really exciting.
Visit www.POLQMWeekendWorkshop.ca for updates as they come.
Tuesday, November 30, 2010
Assessing the Assessors
A number of years ago (2000-2001) I was invited to give a presentation about my proficiency testing program. When I finished, I invited the audience for questions and comments. A technologist stood and angrily complained that proficiency testing and accreditation bodies set themselves up as authorities, but were not required to meet any requirements or expectations. She was absolutely right.
So in 2001 I decided that we had to fix this, and in 2002 we were thoroughly assessed and our organization was certified as having a quality management system that served us well. We have continued the process of external assessment and recertification ever since. In 2010 a new ISO standard for proficiency testing bodies was developed (ISO 17043:2010). We are considering that recognition process as well.
My biggest regret was that I didn’t get the name or contact information of the technologist who woke me up to the critical importance of demonstrating commitment to quality.
I tell the story for two reasons: the first to brag, and the second to make the point that, despite my regularly raising the issue, accreditation bodies in Canada have been much slower off the mark. Indeed my above story could be almost as relevant today as it was 9 years ago.
To be fair, in Canada we have a distributed-responsibility health system with each province responsible for its own health oversight, so there is no single oversight body that demands provincial accreditation bodies have themselves externally assessed for quality and/or competence. But it is not a matter of requirement. It is a matter of obligation and commitment.
But now 2 provincial medical laboratory accreditation bodies (plus 1 more) have stepped up to the plate and achieved international recognition.
In 2004 the province of Ontario decided that it was time to get into laboratory inspections. It was timely because ISO 15189, the new international standard for quality and competence of medical laboratories, had just been published. The newly minted OLA (Ontario Laboratory Accreditation) arm of QMP-LS (Quality Management Program – Laboratory Services) created a standard that incorporated the new standard and others, to ensure that medical laboratories were implementing quality management systems (and other measures of competence). They have become world leaders in ISO 15189 accreditation. ISO accreditation in Canada is done officially under the authority of the Standards Council of Canada, through its signatory relationship with the International Laboratory Accreditation Cooperation (ILAC).
In 2010, my province of British Columbia, going a different route, has nonetheless achieved international recognition through accreditation of its laboratory accreditation standards by the International Society for Quality in Health Care (ISQua). This is a complex process similar to the certification process that I underwent with ISO 9001.
There is a third organization which is not a provincial program, but a not-for-profit, independent organization known as Accreditation Canada, which has for a long time run a voluntary program of accreditation of all (or nearly all) hospitals in Canada. A truly remarkable job. Recently it has been working under contract in one province to assess its laboratories. Accreditation Canada’s standards have also been accredited by ISQua.
To have standards accredited they have to be externally assessed and demonstrate the 6 principles of Quality Improvement, Patient/Service User Focus, Organizational Planning and Performance, Safety, Standards Development, and Standards Measurement (reference: www.ISQua.org).
The other provinces either don’t have a provincial accreditation body, or they have one that has not yet taken the step forward for external assessment and recognition.
There clearly are differences between the ILAC process and the ISQua process, and their strengths and benefits for the clinical and laboratory settings can be debated for a long time. It is similar to the discussion that sees value in differentiating between the accreditation process and the certification process. I’m not going to get into that at this point, largely because I see the argument as all too often driven by bias and competitive commercialism. What is important is that undergoing external assessment demands discipline and rigor and demonstrates a commitment to quality improvement.
So in Canada, we are fulfilling a process that I have been promoting for nearly a decade. If Canadian oversight bodies want respect, they need to demonstrate they deserve it, and to do it by independent external assessment.
And so, congratulations to the province of Ontario OLA program, and to the province of British Columbia Diagnostic Accreditation Program and to Accreditation Canada (and to CMPT!) for allowing the external light of audit and assessment to be shone on their programs.
And my heart-felt thanks to my anonymous friend who got me to get the ball rolling.
m
PS:
Please visit
www.POLQMWeekendWorkshop.ca
for an important notice.
Saturday, November 27, 2010
Patience and Patient Safety.
I was reading my local newspaper and found a columnist discussing an article on Patient Safety and Hospital Error. We can no longer be surprised that the things we do have become front and center in the cross-hairs of the media. We have brought this on ourselves.
The column reported on an article in a recent edition of the New England Journal of Medicine (how scary is it that a local paper columnist is reading articles in the NEJM!), which looked at the rates of “preventable harms” in 10 hospitals after 5 years of active process (2002 – 2007). While there was a reduction (approximately 1% per 1000 patient-days), the reduction was not statistically significant, nor (my opinion) clinically significant, even if it was real. This was true for each category from inconvenience to death. This was despite active engagement in national and state patient safety training programs. The analysis was done with thorough and fair methodology, with internal and external reviewers and appropriate reviewer comparisons.
The authors commented that harm from medical care remains high despite all the programs and the money being spent.
The study confirms reports from across the US and Europe about how little progress is being made. The amount of evidence-based error reduction practices implemented so far has been modest. All the things that can be done, including substantial improvement in handwashing (!), have not been implemented with any consistency or success.
A few thoughts.
Systemic and personal behavior does not appear to be easy to change. While there are folks within health care who are motivated for change, they are not having a lot of success, when measured by outcome. Most of the money being spent on patient safety teams, training seminars, poster displays, and conferences is being wasted (unless your personal income comes from being on a team or charging for putting on training seminars). Change is not happening.
My concern is that if healthcare management and healthcare unions wanted change, then change would happen. I could suggest that there are other vested interests at play, but that would be unfair. But if neither altruism nor fear of malpractice litigation is sufficient to drive change, then what does it take?
At some point the media will care enough and the public will care enough and the politicians will care enough and we will end up with a healthcare version of the Transport Security Administration and we will have our own version of an intrusive airport pat-down. And folks will say, well it may not be fun, but it makes our lives more secure.
For example, the regulators, and legislators always have lots of options, like perhaps a variation on pay-for-performance. Rather than institutions getting a reimbursement bonus for good deeds, every year that they miss their goals, they lose 1%. And insurance providers can start jacking up institution protection rates.
“They” say that carrots work better than sticks, but sticks work.
From my vantage point, hopefully medical laboratories might be having some more success in error reduction because, as much as we are a complex distributed activity, our activities are more focussed than hospital admissions. Some centers have demonstrated some levels of reduced error, at least on a short-term basis. But we need to have an institution perform a longer-term, year-over-year study like the one above to see whether the trends are improving. I suspect that most are afraid to look.
The media already cares, and the public is becoming more aware.
Strike 2.
m
PS: The article is “Temporal Trends in Rates of Patient Harm Resulting from Medical Care” by CP Landrigan et al. N Engl J Med 2010;363:2124-34.
Thursday, November 25, 2010
Thanksgiving, Family and Quality
Thanksgiving is a day for family time, and the beginning of the family season, and so it is unfortunate that it is also a time to become aware of discomfort. Today we learned of the death of David Crosby last week.
David (Dave) Crosby was the younger brother of Philip Crosby, and a quality management expert in his own right. Perhaps not as well known as Philip, David was a regular author and quality contributor. He worked in the Quality arena for 50 years. He wrote two books, including "The Zero Defects Option (How To Get Your People To Do Things Right)". He was a prolific contributor to Quality Digest (www.qualitydigest.com).
David was committed to the notion that it is possible to create a zero-tolerance culture, and that zero tolerance is a leadership choice. Zero tolerance for defects or error was not the original thought of David Crosby, nor of Philip Crosby. It was included in the 14 points of W. Edwards Deming, and was a guiding principle in aeronautics and missile development. Regardless of who expressed it first, it is nonetheless a desirable goal. Now I know and understand the concept of slips and human foibles, but I also know about systemic error that fosters human error.
Zero tolerance for error is a valued goal. As one critic has commented: if buying the book prevents only one defective product or service, you are way ahead of the game. There are no $25.00 errors.
Philip was 4 years older than David and created his quality consulting company one year before. And his fame rating was probably higher. That being said, both brothers can be found in my library.
For my American colleagues, enjoy a belated Thanksgiving (we Canadians celebrated ours a few weeks ago).
m
Tuesday, November 23, 2010
Heresy?
The American Society for Quality has a number of journals, some excellent and others pretty good. One of the latter category, at least in my opinion, is the Journal for Quality and Participation. Without wanting to be harsh, I tend to find the articles, while generally interesting, to be more opinion than fact. (The irony of me, an opinion-oriented blog writer, making this distinction is not lost!)
Nonetheless there were three interesting articles in the October 2010 edition that I received today. One was “Improving Project Performance with Three Essential Pieces of Information” by Portnoy, and another “Creating a Self-Confident Workforce” by Denton. The last one is "Training on Trial” by Kirkpatrick and Kirkpatrick.
I don’t intend to go into any of them in detail (the journal is available at www.asq.org/pub/jap), other than to say that the first article made the point that when designing a project briefing (the deliverable) it is useful to make it brief and unambiguous: the more words and the more jargon included, the more variable the document's interpretation. The training article made the point that training can have a lot of challenges demonstrating that it actually provides a service that will address significant business results. And the workforce article made the point that workers are more self-confident and less stressed when they are empowered to make certain decisions on their own.
I can support all those points.
Which brings me to the heresy.
At a laboratory where I was working, I often found myself in conversations that suggested that some of the standard operating procedures (SOPs) were so detailed and so “precise” that in my opinion, they were largely unfollowable. Even with an adjoining process map they were unfollowable. They were better when pictures were added in, but especially better when pictures were used in place of words. In the process of trying to make SOPs that were all encompassing, I felt we were laying the foundations for error.
And so I started to think that maybe it is not so important to tell microbiology technologists how they have to hold and streak a petri dish, but to let that happen on its own. And defining precise colony-counting methods was so rigid that it likely wasn’t followed anyway. And trying to define all the combinations and permutations of bacterial growth was confusing.
Now there are many procedures involving many pieces of highly precise equipment that do need precise instruction (I understand that), but if we pollute the document set with instructions that are challenging to follow, I think we run the risk of documents that need attention and clarity getting lost in the shuffle.
So my point is that SOP writers and trainers and supervisors need to take a closer look at the procedures that they create to make sure that they actually are useful for training, and more importantly allow the business purpose of the procedures to come through. It’s not only about the value stream, but also about creating documents that give technologists professional autonomy. Smaller and selective documents, it seems to me, make the procedure and much of the decision-making process both more efficient and more effective.
And how heretical is that?
m
Monday, November 22, 2010
Getting the story right?
I have a long abiding interest in things Quality, and especially for the concept of continual quality improvement. (We may not be perfect today, but we can be better tomorrow). From my perspective the single most important concept that underlies continual improvement is the cycle of planning an activity – doing the activity – checking to see if what you thought was done, was in fact done – and then acting to adjust the outcome. And then do it again and again. To me that is the Plan-Do-Check-Act cycle which I know and understand was adapted from Walter Shewhart, and created by W. Edwards Deming. Later in life, Deming modified the terminology to Plan-Do-Study-Act (PDSA), but PDCA is the cornerstone of Quality, it is the cornerstone of ISO9000, and it was created by Deming. So imagine my interest and surprise when I read that I was wrong.
In this month’s (November 2010) edition of Quality Progress (QP) published by the American Society for Quality is a fascinating article entitled “Clearing up the myths about the Deming Cycle and seeing how it keeps evolving”. It is written by Ronald Moen and Clifford Norman. Apparently Moen worked with Deming in the early eighties, and includes within his references for this article a letter written to him in 1990 by Deming himself.
According to this article, while the first part of what I understood was correct, that Deming adapted the cycle first developed by Shewhart in 1939 to a new version in 1950, that was not the Plan-Do-Check-Act cycle. Not only that, but while Deming spoke of a PDSA cycle for many years, he did not publish it as the Plan-Do-Study-Act cycle until just before his death in 1993. Having checked my highly valued reference "The Deming Management Method", written by Mary Walton with W. Edwards Deming in 1986, at least part of what the article says is confirmed.
Without going into further details, I recommend this article as essential reading, especially for those who are interested in the history of Quality (www.qualityprogress.com).
It is fair to say that in the Quality Arena, Deming was (is) a larger than life character. Indeed in the same edition of QP, the lead article is called the "Guru Guide: Six thought leaders who changed the quality world forever" (Shewhart, Deming, Juran, Crosby, Ishikawa and Feigenbaum). At a time when the world industry needed order and structure and change, Deming was there. His books and presentations and 14 Points and 7 Deadly Sins and the famed Red Bead Experiment are the very foundation of Quality, and as relevant today as they were when they were written.
So I might be able (in time) to accept Moen’s and Norman’s thesis that Deming didn’t establish the PDCA, but it does not change my view of the man, nor his role in history or indeed the present or future.
Just as long as no one interferes with my beliefs in Robin Hood, Sherlock Holmes, or Santa Claus.
M
PS: The author of the Guru Guide article, who is an assistant editor for the journal, provides a different history than do Moen and Norman. Go figure.