
Transcription: GMP Data Integrity Expert Panel Discussion – Live Free Webcast

The following transcription is taken from The Windshire Group’s webcast “GMP Data Integrity Expert Panel Discussion”:

Terri Melvin: Welcome, everyone, to today’s webinar, GMP Data Integrity Expert Panel Discussion. I’m Terri Melvin, and I will be today’s moderator. We have with us three expert panelists. Each panelist will first tell us a little about themselves and then address some talking points in their particular areas of expertise. Then we’ll dive into your questions live. I will now turn it over to our first panelist, Dr. James Blackwell.

Dr. James Blackwell: Thanks, Terri, and thanks to everyone in the audience for joining us. I am Dr. James Blackwell, President and Principal Consultant for The Windshire Group. Our hope today is to make new industry friends and acquaintances and help organizations with some of the most pressing questions around this very important and timely topic. Just a couple of house rules before we get started: please don’t ask anything that contains confidential information. We don’t want it and we don’t trade in it. As for today’s webinar, we will not be offering specific advice for anyone’s particular situation, and we’re not responsible in any way for any use of the advice or opinions expressed. After self-introductions, we will share a few of our personal thoughts about data integrity and then get started with questions. I will then close our expert panel webinar with some concluding thoughts. Prior to becoming a consultant, I held technical positions within the industry’s development and product companies. I’ve been an industry consultant for more than 12 years, assisting with GMP technical aspects for products in all major therapeutic classes and all stages of the product life cycle. I have spent a significant amount of time over the last six years working on the quality side of things, including consent decrees, third-party oversight, and organizational remediation, with extensive work on data integrity issues. All of our panelists have significant experience working in these areas as well, and I would now like to introduce Danielle DeLucy for her self-introduction.

Danielle DeLucy: Hello everyone, and welcome to our webinar today. My name is Danielle DeLucy, and I have been in the industry for about 16 years. I worked primarily in microbiology, starting as a bench microbiologist for a contract laboratory, and then quickly became very interested in quality systems and quality assurance. I have worked with many biologic and pharmaceutical companies, learning a lot about all of these quality assurance systems, and along the way I focused a great deal on data integrity. Believe it or not, a lot of issues reared their ugly heads in some of the situations and deviations I worked on from a data integrity perspective. Then, as I became a bit more interested in these systems, I started consulting to help companies set up quality systems. Now I am the owner and operator of a quality consulting company, and I also consult with The Windshire Group. That’s a little bit about me, and I will hand you over to Al so that we can hear a little bit about him.

Alfonso C Fuller: Thank you, Danielle. Hello, everybody. I’ve been concentrating on FDA-regulated industry for about 20 years, and I have been working in software quality, software validation, and software-related issues for about 40 years. Most of the perspective you hear from me today will be on the computer-related side. But as everybody has said, over the next hour we will talk about an issue that cuts right across most of your business. The area I want you to watch out for is where you are increasingly automating and computerizing various applications within the business. All that said, why don’t we dive right into the discussion and the questions.

Terri Melvin: Dr. Blackwell we are ready for your overview.

Dr. Blackwell: Thanks, Terri. I see data integrity as the glue that holds the cGMPs in place. While the term “data integrity” isn’t specifically mentioned in the predicate rules, you will find the concept replete throughout them. Obviously, Part 11 is a very important aspect of data integrity, and there is extensive guidance out there around electronic data and signatures, which impacts the computer systems Al was speaking to. The question comes up: why does this seem to be more of an issue recently? I did a Google search on the term “data integrity” versus “GMP” as a control, if you will. Interest in data integrity has increased two-fold in the past five years, so this is a growing interest, and it has started to feed upon itself in terms of why you are hearing others talking about it. It’s also no surprise that the data to support this, observations and Warning Letters involving issues of data integrity, are up sharply. I think part of this is sensitization on the part of regulators, who are now trained, including forensically trained inspectors who actively look for these issues. I think this is also related to the fact that supply chains within the industry have become more complex and more outsourced. You now have organizations relying on other organizations, so there is an inherent lack of control over those supporting organizations. There is now more manufacturing outside the USA. You also have more companies in the supply chain that historically haven’t been as mature in terms of quality systems, and there is probably a cultural dimension to this as well. On top of this, we have increasing profit pressures within the industry, which may be more acute for companies selling overseas.

And so, for example, right now in China there is a huge data integrity issue associated with clinical trial data; it has basically implicated all the clinical trials that have been conducted in China, and the Chinese FDA is taking that situation very seriously. In terms of individual organizations, the responsibility for data integrity starts with management, who need to create a culture of compliance. But not all of the responsibility falls on management. Personal responsibility is very important, because an individual act of falsification can put the whole organization under a cloud. I have personally seen this happen, where falsification occurred by an individual, a person in a responsible position, during an FDA inspection, and the FDA inspector caught onto it immediately. This immediately put the whole organization and its culture under a cloud, and that had huge ramifications for that organization. Those are my introductory thoughts. I would like to turn it over to Danielle.

Danielle: Thank you, James. As you heard in my opening points, I have a lot of laboratory experience; I like to call myself a lab rat at heart. I’ve seen a lot of citations over my time in industry pertaining to data integrity, and much like Dr. Blackwell said, there have been a lot of security-infraction citations and Warning Letters coming about quite frequently of late pertaining to data integrity issues. One thing you really have to look at with laboratory data is the integrity of that data: making sure that you instill in your laboratory technicians, managers, and supervisors the importance of integrity and of doing the right thing. And let’s face it, a lot of the time laboratory technicians are on the bench by themselves, largely unsupervised, expected to process samples quickly so those results get to Quality, who approve them so the product can ultimately go out for release. Along the way, if they do see an issue, we have to make sure they are able to flag it, so the data stays intact and the laboratory doesn’t ruin its reputation. As Dr. Blackwell just stated, any data integrity issue puts quite a cloud over the laboratory, and you see a lot of FDA Warning Letters pertaining to data integrity issues. Again, I’ve seen things happen firsthand during an inspection, right in front of the investigator, and sometimes it’s lack of training. It’s not always purposeful; sometimes people just don’t understand, and it’s really ignorance rather than intent. There are also different definitions of data integrity; one laboratory or one part of the industry may think of data integrity a little differently. That can be a concern, so it’s very helpful to have a site policy and procedure on data integrity, so that everybody is on the same page when it comes to this really important quality system. That’s a little bit about the points I’d like you to consider, and I will hand it over to Al for his.

Al: Thank you, Danielle. I’d like to kick off this part of the discussion with some thoughts about data integrity from the perspective of those of you who are users of automated systems, or who are instrumental in bringing them online, qualifying, validating, or maintaining them. Because the data in computer systems is basically invisible, you want to make sure you focus, as Danielle said, on having practices and policies in place ahead of time, so that your organization does not get caught out in a corporate audit or an investigation from FDA or some other health authority. Let’s start with planning: making sure that people understand the basics of data integrity around the organization. I would recommend that you have somebody whose job it is to pay attention to this; they could be called a data integrity officer or a site data integrity lead, the two most common titles. They would have defined job duties, responsibilities, and authorities around data integrity, including performing data integrity reviews or walk-throughs on a formal basis, with documentation and follow-up of any observations. You should ensure that you have a complete inventory of any systems capable of generating data; you can probably leverage or augment some existing inventories. Then look at those inventoried systems to determine whether they appear to consistently deliver data with the requisite integrity, data that can be relied upon in the ways we are talking about today. Finally, as part of your planning process, I would advocate having a data integrity plan that integrates with all the other high-level plans at your facility and that covers the people, the systems, and the processes that tie them together, so your personnel can understand what expectations your organization has for them around data integrity. Terri…

Terri: Thank you, panelists. We will now open it up for your questions. Simply type your questions into the question box. Now for our first question.

Question: What are the areas of greatest risk for data integrity issues?

Danielle: I can speak to that. Much as I said in my opening talking points, the laboratory obviously is a huge generator of data, and many times it is the area where you really need to focus a lot of data integrity training. Kids coming out of college, biology or chemistry majors, get their first job in the laboratory. They don’t really know the ropes, if you will, of how the GMPs are structured or how data integrity should be handled. So, again, I really want to stress that you should focus on training on data integrity related issues and on getting those policies and procedures in place, if you don’t have any. A lot of the time, the issues coming out of the laboratory involve not recording your data in a timely fashion, or maybe recording it before you actually get results, because sometimes there’s pressure to release product. I’ve seen so many examples of people under pressure to release product who really feel like they’re doing the business justice by putting a result down on a sheet of paper and handing it over to Quality. They don’t realize the impact of going against whatever data integrity policy the company may have, and that the FDA expects data integrity to be at the forefront of the laboratory. So, I would say the lab will be the first area where you would see data integrity issues coming out. I’m sure the other panelists have additional examples they can provide as well.

Al: Sure, I can add a little bit. Keep in mind, before I get into the details, that as somebody said earlier, today’s FDA investigators are far more highly trained to anticipate and look for data integrity issues than they were, say, five or ten years ago. In the computer area, they are also all now trained in computer validation, at least the basics. Everybody in the FDA field enforcement staff has been sent back for training, and it’s no longer a matter of a particular person having an interest in this area.

FDA management is expecting them to look at your organization from a data perspective. With that background, I would say there are two things in particular that come to mind that FDA and other health authorities are focusing on right now. One is audit trail reviews and the other is individualized, defined user access. So, what do we mean by that? Audit trail reviews means they expect you to have documentation showing that, where you have used computerized systems in GxP operations, you have regular reviews of the audit trails on those systems. At least at a technical level, not necessarily looking at all of the data, but at least enough to assure that your audit trails are operational, that they are turned on, and that they have not been modified in the intermediate period, so that you have confidence in them. That also means, by definition, that the system has to have audit trails. They need to be automatic, time-stamped, and computer-generated, with all the other attributes a log file would have, so that you can call them, quote, “audit trails.” You have to have the audit trail, it has to be turned on, it has to be automatic, it has to be working, and then you periodically review it to ensure that functionality is there. This is a backstop for your basic data integrity practices. A well-structured and well-operated audit trail is going to tell you who did what, when, and why. If you can demonstrate that you’ve got that kind of background data, audit trail metadata, about your primary data, then that provides a lot of assurance that the primary data is intact. Your basic primary data integrity is there by virtue of the evidence provided by the audit trails.
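To make Al’s description concrete, here is a minimal Python sketch of an audit trail entry with the attributes he lists: automatic, time-stamped, computer-generated, and capturing who did what, when, and why. The file name, field names, and example action are illustrative assumptions, not taken from any particular system.

```python
import json
from datetime import datetime, timezone

AUDIT_LOG = "audit_trail.jsonl"  # hypothetical append-only log file

def record_event(user_id: str, action: str, target: str, reason: str) -> dict:
    """Append one audit trail entry capturing who did what, when, and why."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),  # computer-generated, not typed in by the user
        "user": user_id,   # attributable to an individual account, not a shared login
        "action": action,  # what was done, e.g. "reprocess"
        "target": target,  # the record acted upon
        "reason": reason,  # the "why", captured at the time of the change
    }
    with open(AUDIT_LOG, "a", encoding="utf-8") as log:
        log.write(json.dumps(entry) + "\n")  # append only; existing entries are never edited
    return entry

# Example: an analyst reprocessing a chromatogram leaves a traceable entry.
record_event("jdoe", "reprocess", "batch-1042/injection-07",
             "integration parameters corrected per SOP-123")
```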

The second thing a lot of folks are looking at very closely these days is individualized, defined user access: basically, individual usernames and passwords for access to equipment and computers. This is mostly an issue at the equipment level, on the manufacturing floor. On most higher-end computers, desktops for instance, you automatically log in with your assigned username and password, some way to authenticate that a particular individual is using the machine, so that actions taken on it are attributable. That’s really the “attribution”: actions taken on the machine are attributable to this person, along with a record of when and what happened. What happens with a lot of manufacturing equipment is that it was never really set up for this kind of individualized access, so a lot of you have equipment with basically three usernames programmed into it, like user, supervisor, and admin. You know the machines I’m talking about. All the users share the same password; the admin and manager may have a different password, but they all share it too. The same goes for the mechanics and people who work on the machine: it’s a shared-password situation. So there is no record of which individual performed which activity on that equipment, and that kind of operation is no longer acceptable to FDA and other health authorities. If you have equipment like that, then you probably want to start planning to migrate those individual pieces of equipment onto some kind of network segment within your organization. You may have a secure manufacturing network segment, and the same sort of thing applies in laboratories, where you may have a secure lab network segment, where you want to start putting those machines. For the manufacturing machines, some of them have the native intelligence and the networking hardware, and all you have to do is connect them to the network and establish SOPs and processes around controlling this. Others are literally old enough or simple enough that they don’t have the hardware, and you may want or need to upgrade the PLC to connect them to the network. Once you do, though, I think you are going to get much better control of the machine: you can deliver recipes directly to it and reduce operator error, and you can probably make it more efficient to operate, on top of the fact that you get much better control of your data and your data integrity. So it’s not a costless solution, but there are gains to be had in addition to compliance, probably some pretty good operational gains in those cases. I would say those are the two major things to look out for right now.
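As a small illustration of the pattern Al describes, here is a hypothetical Python check that could be run during a system inventory review to flag generic, shared logins on a piece of equipment. The account names and the example data are assumptions for illustration only.

```python
# Hypothetical helper for the inventory walk-throughs described earlier: flag
# equipment logins that look generic/shared rather than tied to an individual.
GENERIC_ACCOUNTS = {"user", "operator", "supervisor", "manager", "admin", "service"}

def flag_shared_accounts(accounts: list[str]) -> list[str]:
    """Return account names that appear to be shared, role-based logins."""
    return [name for name in accounts if name.lower() in GENERIC_ACCOUNTS]

# Example: a machine with the classic three built-in logins plus one individual account.
print(flag_shared_accounts(["user", "supervisor", "admin", "jdoe"]))
# -> ['user', 'supervisor', 'admin']  (candidates for migration to individual accounts)
```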

Dr. Blackwell: My perspective on this is to really look at your controls and your critical points around your product release decisions. When FDA talks about your critical controls, they are not talking about critical control parameters; they are talking about the data that goes into the decision you use to release a product. You can do data mapping to look at where the risk points are. Some examples are legacy lab instruments that are not fully Part 11 compliant. You then have what is known as a hybrid system, between paper systems and the lab system, and that creates some real risk points, where people may change metadata or parameters without it being caught. So I recommend having detailed procedures requiring people to document everything within a logbook or a controlled form in those cases.

Another example is filter integrity testers. A lot of those legacy systems don’t meet today’s standards and need to be replaced. With that, I’d like to take a different tack. I think one of the biggest risks is that, when there are serious data integrity issues, some organizations may want to take that skeleton and try to bury it in the backyard. But you really need to be proactive about these things, involve the Agency at the appropriate time, and inform them. Some of that is required by law, but I think being proactive is the best advice. If you look at FDA guidance on this, let’s say you have a Warning Letter or observations, they actually point to some of their guidance from 1991, Points to Consider for Internal Reviews and Corrective Actions, which entails bringing in a third party or a consultant to assist with those situations. What that guidance doesn’t speak to is business practice, so I would say that what mature organizations do is recognize serious issues and proactively address them by bringing in third parties to assist with remediation before they become bigger problems.

Terri: Okay, great! Since we are on risk, I’ve got a great follow-up question about risk. Is there a risk of data integrity issues with product complaint handling systems, consumer relations, or customer service databases that capture product issues?

Dr. Blackwell: There certainly is. If you go back and look at observations the FDA has made, this has actually been one of the focal areas. It’s a real challenge because you have complaints coming in from various sources and a lot of third parties involved, so you really need quality management system functionality that handles and manages all of that for you. If you are just relying on paper records and emails, the situation can quickly get out of hand. The foundation here, as with other data integrity issues, is making sure that you’re doing timely and appropriate investigations around those issues. Those are some of the challenges I see: managing the data flow coming in, documenting your actions, and showing that you reacted to those things in a timely manner.

Danielle: I can also speak firsthand from some of the situations I have been in. Dr. Blackwell is certainly correct about reacting in a timely manner in these situations. Many times, in situations I’ve been involved with, whether data integrity issues or just basic deviations, you have to make sure not only that you’re reacting quickly, but that your justification is sound and robust. FDA investigators and many regulators may not fight you on a lengthy investigation timeline, maybe it’s open past 30 or 40 days, but they will certainly cite you if they feel you haven’t done your due diligence with your investigations. So make sure the justification is there, especially if product release decisions are involved; they will certainly look at that as well.

Terri: Al, do you have anything to add?

Al: A little bit. Some larger organizations will use software to manage the complaint process, a pharmacovigilance safety system or something equivalent. With systems like that, you have a couple of opportunities for data integrity challenges: basically, the front end and the back end. On the front end, information is coming in, and you want to ensure that you capture it accurately and in a timely manner. The organization has an interest in ensuring that all of that information is complete and correct. There may be subunits that don’t necessarily share the organization’s goal of getting all that information completely documented, because they have operational concerns about controlling those numbers. So you do want to make sure that the controls Dr. Blackwell has been talking about are in place in that system. That system will typically also be tied to your reporting system, and everywhere you sell product, the health authority there is going to have rules about when you need to notify them of particular kinds of complaints. Obviously, for the most serious ones they need to be notified within 24 hours or one business day; other things stretch out to the point where only annual reporting may be required. Then you may have a situation where you are selling into a market with overlapping jurisdictional control, as I think most of you are aware: a country with its own controls, plus EU controls, and then you are in a group of three or four countries. Using a computer system is a wonderful way to help manage that. But you do want to make sure that, for the information coming in, you have all the controls and the data integrity in place, because what goes out in your reporting to the health authorities you are responsible to is only as good as the information in the system. As you maintain data integrity within the system, you help yourself right across the board.

Terri: Thank you. Next question: What is the recommended time frame for an audit trail review?

Al: There is no recommended time frame from FDA or any other health authority. It probably depends on the system: the nature of the system and the frequency and volume of data going into it. The FDA and other health authorities do recognize that a risk-based approach is appropriate, so what you probably want to do, for various systems or kinds of systems, is a risk assessment, and document a basis for the review period you wind up setting. All that said, I would think that a review interval longer than, let’s say, 18 months or two years is probably not going to withstand scrutiny with the Agency, and an interval that long would be for a system that is relatively low risk, with relatively low volumes of data. When you get to systems with higher velocities of data, where the data is more critical, the expectation is that the reviews will be more regular. I have one client I am working with right now who is installing a new packaging line for serialization, individualized serial numbers down at the bottle level. It comprises three or four different software packages put together for the labeler, the optical systems, and everything else. We put in place an SOP to review the audit trails on that system, and it’s going to be monthly: we are going to take a random sample of a number of batches and review the audit trails from those batches each month. That is a balance. Obviously you’re not looking at every batch or every run, but we are certainly not letting more than a month go by without looking at things. That’s the kind of approach you want to take: analyze the risk of the system and definitely document your decision ahead of time. Do not get caught in a situation where somebody comes to inspect you, challenges your approach, and you have no documented basis for it, so it feels like you made it up out of thin air. Even if they don’t like your approach, if you at least have a documented rationale, then you are in much, much better territory. Danielle, Dr. Blackwell, anything to add on that one?
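Here is a minimal sketch of the monthly review cadence Al describes: randomly sample batches from the period and queue their audit trails for review. The lot numbers and sample size are illustrative assumptions, not values from his client’s SOP.

```python
# Sketch: pick a random sample of this month's batches for audit trail review.
import random

def select_batches_for_review(batches: list[str], sample_size: int = 5) -> list[str]:
    """Randomly pick the batches whose audit trails get reviewed this month."""
    return random.sample(batches, min(sample_size, len(batches)))

released_this_month = [f"LOT-17-{n:03d}" for n in range(41, 68)]  # hypothetical lot IDs
for lot in select_batches_for_review(released_this_month):
    print(f"Review audit trail for {lot}")  # each review is documented, with follow-up of observations
```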

Danielle: Just a quick note. A lot of the time, with some of my clients and also when I worked in industry, we would give the task of audit trail review to some of our quality assurance internal auditors. Doing spot checks along the way, maybe every few months or so, was a great thing to do, because it was an eye-opening experience to see some of the things that may have been going wrong, or things we needed to look at, and we were able to address them before any regulators came in. That’s also an option. I always like to get internal auditors involved to do small checks, or surprise small checks, on some of the critical systems in our quality system management arena.

Dr. Blackwell: Yes, I would add that when you are dealing with critical data, I think the metadata audit trail should be reviewed for anything that directly involves product testing and release. For example, an HPLC unit performing an analysis would generate an audit trail around that test as part of releasing the results from that instrument. That audit trail should be reviewed as part of product release. Again, this speaks to what Al said about the risk-based approach: I think any risk analysis would tell you that your audit trail metadata around product release testing, particularly in the QC lab, should be audited as part of product release.

Terri: I have a very specific question here: What is expected with the Windows event logs when saving data locally and then transferring the data to a secured network? Al, I think that’s for you…

Al: I don’t know that there is any particular expectation around the Windows event logs; FDA is looking at the data. You want to make sure the process we are discussing is validated. I’m assuming we’re talking about a GxP operation in the organization, and not auto mileage records or something. If you are using a computer system in an operation governed by FDA or another health authority, then it needs to be validated; we’re in the GxP realm. This would be part of the validated system: you identify the data being generated at the source, then you look at the data at the point where it lands, and you verify that the system is transporting the data intact, including the metadata, the data about the data, around it. The audit trails, the metadata, and the primary data should all wind up together, in the same format and form, with the same usability at the end. If that is happening and you have evidence that it’s happening, then you should be okay. Nobody is going to expect to look at a Windows event log for intermediate actions around that operation. Now, that said, if you are having problems, then all bets are off; it’s not really a validated system, because you haven’t verified that it has user requirements and is meeting its expectations consistently. So I suspect that if any of you have a question where the Windows event viewer or event log is coming into play, it’s probably because things are not going the way they are supposed to go. In that case, you can use the Windows event tools to diagnose your problems, but keep in mind that those are symptoms, not your core issue, right? Keep your eye on the ball where it needs to be. I am dealing with this right now on a lab instrument, a particle size instrument, where they are trying to upgrade the software and can’t create new methods because there is some incompatibility between the application software and the operating system, and the validation engineer is elbows-deep in the Windows event log. Okay, that’s fine, but when you figure it out, back up to the high level and ask: it’s supposed to do A, B, C, and D; is it now doing A, B, C, and D, and do we have evidence that it does that consistently? To put a bow on that part of the comment, the event log is a valuable tool, but it’s really not something FDA is going to focus on. They are going to focus on the high-level steps: is your data intact?
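A minimal sketch of the end-to-end check Al describes: verify that the file generated at the source and the copy that lands on the secured network are identical. The paths are illustrative; in a validated process, this check and its result would be documented per your SOPs.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 checksum of a file's contents."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

source = Path("C:/local/run_0042.dat")            # data as generated at the source
landed = Path("//secure-share/lab/run_0042.dat")  # data after transfer to the network

if sha256_of(source) == sha256_of(landed):
    print("Transfer verified: primary data intact.")
else:
    print("Checksum mismatch: investigate before relying on the transferred data.")
```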

Terri: Danielle, James, anything to add?

Dr. Blackwell: Not for me.

Terri: Next question: Any thoughts on using Excel spreadsheets as tracking tools for things like training, issuance of deviations, etc.?

Dr. Blackwell: This comes up a lot; it just came up with one of my clients. Again, I would use a risk-based approach. There are tools available to validate these systems within the Excel environment, and there are a lot of capabilities there that can support an audit trail. If you are using spreadsheets to run critical decision-making processes, you definitely should validate them, and they can be validated so that they are fit for purpose. Where it starts to get a little murky: I’ve seen these used, for example, to document maintenance calls, or issues around things happening in the facility, where people use them to log events. There I recommend having procedures that control the Excel spreadsheet and ensuring that entries cannot be changed later, which may entail locking down the spreadsheet and putting it, in essence, under the control of your IT group or something like that. It is a manageable issue, but it’s a good question and a very common one.

Terri: Danielle, any ideas or comments about utilizing Excel?

Danielle: I’d just like to add that I have had experience trying to defend something like this with regulators. In my former life as a QA auditor, we had a spreadsheet where we tracked some of our internal audits instead of using a software system. Granted, the FDA really isn’t privy to the actual audit contents, but they can certainly see that we had been doing them, and that was our proof to show that we were performing them according to procedure. We were questioned quite a bit on how we were validating that spreadsheet and who had access to it, and they were none too happy with us using an Excel spreadsheet to track that information. In response, we moved it into a more validatable software system compliant with Part 11, and it worked out better for us as a company. So be cautious about using those spreadsheets, as Dr. Blackwell said. Make sure they can be locked down, with definite locked-down user access controlling who can change the data. We also had one that tracked some of our assay potency results, so if you do track any type of results or do any calculations in a spreadsheet, make sure it is locked down and the data certainly cannot be changed. It is kind of a slippery slope, and whenever I consult and a client uses Excel spreadsheets, I try to push them in the other direction, toward a more validatable software program.

Al: I would concur with what we’ve heard so far. I suspect the reason we have this question is that the member of the audience has some doubts about the appropriateness of using these Excel spreadsheets in GxP operations, and as you can tell from our comments, that is probably a well-placed concern. This is a fairly complex subject, but what it boils down to is that while you can validate that a spreadsheet does exactly what it’s supposed to do, it is a struggle to secure the spreadsheet in a way that is going to satisfy FDA or the health authorities. Excel is a wonderful desktop tool, but it was really not designed for this level of use, and audit trails in particular are a challenge for Excel spreadsheets. My advice, if you are going to use Excel spreadsheets in a GxP operation, is to do a couple of things. One: definitely validate the spreadsheet, so you have evidence that it actually does what you say it does, with user requirements and functional specs and all of that, as for any software, and then test it. Then, as Dr. Blackwell was saying, lock it down: put it on a network share where IT controls access to it, so people can run it, but you are not using it as a database. You are basically using the calculation tools of the spreadsheet, and the output gets stored someplace else. So you do want to validate it, but you also want to consider it a temporary or interim solution, and then find a way to do this regulated GxP work with a tool more appropriate to the job, as Danielle was describing. One way would be to take this to IT and say, I need an application for this. They can probably do it in-house; sometimes you have to send it out. But especially if your spreadsheet has already defined the user requirements and functional requirements, and you already have the spreadsheet working, they can see what you’re trying to do. They can see the inputs and the kinds of outputs you need, and they can whip up a small application in relatively short order. This is not a huge effort, but they can put together software that will do the same work, and it will be validated, compliant with Part 11, have audit trails, be secure, and meet all of your needs, and then you can retire the spreadsheet. So the advice would be: consider the spreadsheet an interim tool while you get to something sustainable in the long run.
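As a sketch of the “lock it down” step the panelists describe, here is one way to protect a worksheet using the third-party openpyxl library. The file name and password are illustrative; sheet protection deters casual edits but does not replace the validation, IT-controlled access, and audit trail measures discussed above.

```python
from openpyxl import load_workbook

wb = load_workbook("deviation_tracker.xlsx")
ws = wb.active

ws.protection.sheet = True            # enable worksheet protection (cells are locked by default)
ws.protection.password = "change-me"  # password required to unprotect the sheet in Excel

wb.save("deviation_tracker_locked.xlsx")
# The locked copy would then sit on an IT-controlled network share with
# restricted write access, per the practice described above.
```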

Terri: I have a really good question here. Upgrading or replacing systems, regardless of their age, is rather expensive in our industry. Do you have any recommendations for remediating the conundrum of shared user, supervisor, or admin passwords, and what about system use logs?

Al: We do want to get rid of shared passwords, as we were discussing earlier, and that may be a primary motivator for upgrading a system. Then you look at the cost of the software and the downtime. You might need a new PLC or a new computer, because the old PLC doesn’t have the Ethernet port to connect to the authentication server where your usernames and passwords are stored; or, if the equipment actually has a small computer in it, it’s running Windows XP, which is no longer supported, or an even older operating system. So when you start this, it can easily cascade from a fairly small, well-defined request, we need individual usernames and passwords, to something much larger: we need a new version of the software, we need a new computer, it needs to be validated, and there is going to be some downtime for the machines, so there is some impact on production, etc. I’d love to tell you that I have a magic wand over here and I can make all of that go away. Unfortunately, I don’t have that magic wand. There are ways we can streamline the process, and planning helps to mitigate a lot of the effects, but basically it’s just time to upgrade some of that stuff. If you’ve got equipment running on Windows CE or Windows XP, or some of these really old PLCs that just aren’t connected and don’t have the functionality you need today, you’re living on borrowed time and you do want to upgrade. Partly because you want to satisfy FDA and other health ministries, but also keep in mind what happened three weeks ago with the WannaCry issue. Even if you have very good border security, great firewalls and routers, all it takes is one tech coming into your facility for a legitimate purpose with a thumb drive that got infected someplace, and plugging it into a computer. That’s not somebody doing something nefarious, and if those computers are old and unpatched, they are going to be vulnerable, with infections spreading like wildfire inside your organization. Border firewalls and routers are not going to prevent that, but keeping current with your operating systems and your patch level will. So you have additional gains to be had by upgrading the systems, because you face threats today that you didn’t face, say, 5 or 10 years ago.

Dr. Blackwell: Terri, what I would add is that a real challenge for organizations is that they’ll have data in those old legacy systems, and that data still needs to be accessible. Organizations need to really think that through, which may mean keeping those legacy operating systems, as Al has said, in mothballs, while still being able to make the data accessible during an inspection. Part of the challenge may be addressable with what’s known as data migration; again, that requires validated procedures and getting the data into another place and form that meets the ALCOA-plus principles.

Al: Absolutely.
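Here is a minimal sketch of one verification step consistent with the validated data migration Dr. Blackwell mentions: confirm that record counts and per-record content (data plus metadata) match between the legacy source and the new system. The record layout and example values are illustrative assumptions.

```python
import hashlib
import json

def fingerprints(records: list[dict]) -> set[str]:
    """Hash each record so the source and target sets can be compared."""
    return {hashlib.sha256(json.dumps(r, sort_keys=True).encode()).hexdigest()
            for r in records}

legacy   = [{"id": 1, "result": 98.7, "analyst": "jdoe", "tested": "2016-03-01"}]
migrated = [{"id": 1, "result": 98.7, "analyst": "jdoe", "tested": "2016-03-01"}]

assert len(legacy) == len(migrated), "record count mismatch"
assert fingerprints(legacy) == fingerprints(migrated), "record content mismatch"
print("Migration verified: all records and metadata fields intact.")
```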

Terri: Danielle, anything to add?

Danielle: No, I think they have covered it well.

Terri: Excellent. Thank you all for your expert answers. I am now going to turn it over to Dr. Blackwell for some closing remarks.

Dr. Blackwell: Thanks, everyone, for attending; we hope you came away with useful insights and a greater understanding of data integrity. If anyone has specific consulting needs in this area or on another topic, we would like to know how we can help. Our contact information is on this slide. We would love to hear from you. The Windshire Group looks forward to sharing our expertise with you in the future. Bye, everyone.

Terri: Good-bye everyone, I’m concluding the webinar now. Have a great day.


For a free download of the entire data integrity webcast recording –  click here

Read our recent blog post “What are the areas of greatest risk for data integrity issues?” – click here
