DEPARTMENT OF HEALTH AND HUMAN SERVICES

FOOD AND DRUG ADMINISTRATION

CENTER FOR DRUG EVALUATION AND RESEARCH

PROCESS ANALYTICAL TECHNOLOGIES (PAT) SUBCOMMITTEE OF THE ADVISORY COMMITTEE FOR PHARMACEUTICAL SCIENCE

VOLUME I

Wednesday, June 12, 2002

8:30 a.m.

Hilton/Gaithersburg

620 Perry Parkway

Gaithersburg, Maryland

P A R T I C I P A N T S

FDA Staff

Kathleen Reedy, RDH, MS, Executive Secretary (acting)

Ajaz Hussain, Ph.D.

Committee Members:

Thomas Layloff, Ph.D., Acting Chair

Gloria L. Anderson, Ph.D.

Judy P. Boehlert, Ph.D.

Arthur H. Kibbe, Ph.D.

SGE Consultants:

Melvin V. Koch, Ph.D.

Robert A. Lodder, Ph.D.

G.K. Raju, Ph.D.

Guest Speakers/Participants:

Eva M. Sevick-Muraca, Ph.D.

Leon Lachman, Ph.D.

Emil Walter Ciurczak, Ph.D.

Kenneth R. Morris, Ph.D.

Howard Mark, Ph.D.

Thomas Hale

Industry Guests/Participants:

Efraim Shek, Ph.D.

Ronald W. Miller, Ph.D.

David Richard Rudd, Ph.D.

Rick E. Cooley

Colin Walters

Doug Dean, Ph.D.

John G. Shabushnig, Ph.D.

Jerome Workman, Jr., M.A., Ph.D., FAIC, CChem, FRSC

Jozef H. M. T. Timmermans, Ph.D.

Robert S. Chisholm

John C. James, Ph.D.

Jeffrey Blumenstein, Ph.D.

Dhiren N. Shah, Ph.D.

Henry Avallone, B.Sc.

Open Public Hearing Speakers

Justin O. Neway, Ph.D.

Li Peckan

Allan Wilson

Dan Klevisha

Tom Tague

John Goode

C O N T E N T S

AGENDA ITEM

Call to Order - Thomas Layloff, Ph.D.

Meeting Statement - Kathleen Reedy

Introduction, Overview, Training and Objectives of Subcommittee - Ajaz Hussain, Ph.D., Deputy Director, Office of Pharmaceutical Science

Subcommittee Discussion on Training

Regulatory Challenges

- PAT Applications in NDAs - Jeffrey Blumenstein, Ph.D., Pfizer Inc.

- PAT Application in Post-Approval - Dhiren N. Shah, Ph.D., Aventis Pharmaceuticals

- PAT Validation/GMP Issues - Henry Avallone, B.Sc., Johnson & Johnson

Subcommittee Discussion on Regulatory Risk

Open Public Hearing

Subcommittee Discussion on Regulatory Risks and Training

Working Group Sessions and Strategy

Adjourn

 

P R O C E E D I N G S

DR. LAYLOFF: Okay. Kathleen told me it's time to get started, and you know how Kathleen is. First of all, I'd like to welcome you all to our second meeting of the Process Analytical Technologies Subcommittee. It's a pleasure to be here with you all to talk about new and exciting toys for big boys--new technologies, one of my favorites. And before we get started, Kathleen's going to read to us the Meeting Statement.

MS. REEDY: Acknowledgment related to general matters waivers for the Process Analytical Technologies Subcommittee of the Advisory Committee for Pharmaceutical Science, June 12, 2002.

The following announcement addresses the issue of conflict of interest with respect to this meeting and is made a part of the record to preclude even the appearance of such at this meeting.

The Food and Drug Administration has prepared general matters waivers for the following special Government employees which permits them to participate in today's discussions: Dr. Judy Boehlert and Dr. Melvin Koch.

A copy of the waiver statements may be obtained by submitting a written request to the agency's Freedom of Information Office, Room 12A-30 of the Parklawn Building.

The topics of today's meeting are issues of broad applicability. Unlike issues before a committee in which a particular product is discussed, issues of broader applicability involve many industrial sponsors and academic institutions.

The committee members have been screened for their financial interests as they may apply to the general topics at hand. Because general topics impact so many institutions, it is not prudent to recite all potential conflicts of interest as they apply to each member.

FDA acknowledges that there may be potential conflicts of interest, but because of the general nature of the discussion before the committee, these potential conflicts are mitigated.

We would also like to note for the record that Dr. Efraim Shek, of Abbott Laboratories, is participating in this meeting as an industry representative, acting on behalf of regulated industry. As such, he has not been screened for any conflicts of interest.

With respect to FDA's invited guests, there are reported interests that we believe should be made public to allow the participants to objectively evaluate their comments.

Dr. Leon Lachman is president of Lachman Consultants Services, Incorporated, a firm which provides consulting services to pharmaceutical and allied industries.

Dr. Howard Mark serves as a consultant for Purdue Pharma Incorporated.

Dr. Kenneth Morris serves as a consultant, speaker, researcher, and has contracts and grants from multiple pharmaceutical companies.

In the event that the discussions involve any other products or firms not already on the agenda for which FDA participants have a financial interest, the participants' involvement and their exclusion will be noted for the record.

With respect to all other participants, we ask in the interest of fairness that they address any current or previous financial involvement with any firm whose product they may wish to comment upon.

DR. LAYLOFF: Okay. Thank you, Kathleen.

I'd like to now go around the table and have you introduce yourself and your affiliation. We'll start with John James.

DR. JAMES: Yes, good morning. My name is John James. I'm the Executive Director of Operations Services for Teva Pharmaceuticals.

DR. SHABUSHNIG: Good morning. I'm John Shabushnig, and I'm the Director of the Center for Advanced Sterile Technology at Pharmacia Corporation.

MR. COOLEY: Good morning. Rick Cooley, process analytical chemist with the Management Technology Group of Eli Lilly and Company.

MR. WALTERS: Good morning. I'm Colin Walters, Schering-Plough Product Optimization. I'm a senior engineer.

MR. CHISHOLM: Good morning. I'm Bob Chisholm of AstraZeneca International, Technology Manager based in the U.K.

MR. WETSTONE: Good morning. I'm James Wetstone, the Chief of the Process Measurements Division of the National Institute of Standards and Technology.

DR. TIMMERMANS: Good morning. Jozef Timmermans from Merck and Company, Manager of the Pharmaceutical Technical Operations Group at West Point.

DR. WORKMAN: Good morning. Jerry Workman, Senior Research Fellow of Kimberly-Clark in Wisconsin.

MS. SEKULIC: Good morning. I'm Sonja Sekulic, Assistant Director, Technology Development at Pfizer in Groton, Connecticut.

DR. RUDD: Good morning. David Rudd from Process Technology in the Pharmaceutical Development Group in GlaxoSmithKline in the U.K.

DR. MILLER: Good morning. Ron Miller, Principal Technology Fellow, Bristol-Myers Squibb.

DR. SHEK: Good morning. Efraim Shek, Divisional Vice President for Pharmaceutical and Analytical R and D, Abbott Labs.

DR. ANDERSON: Good morning. Gloria Anderson, Gallery Professor of Chemistry, Morris Brown College, Atlanta, Georgia.

DR. KIBBE: Good morning. Art Kibbe, Professor of Pharmaceutics and Chair of the department, Wilkes University.

MS. REEDY: Kathleen Reedy, Food and Drug Administration.

DR. LAYLOFF: I'm Tom Layloff and I'm an SGE with FDA, but my day job is with Management Sciences for Health and International Pharmaceutical Regulation.

DR. BOEHLERT: Judy Boehlert. I have my own consulting business, consulting in the areas of quality, regulatory affairs, and product development.

DR. KOCH: Good morning. Mel Koch, Director of the Center for Process Analytical Chemistry at the University of Washington.

DR. SEVICK-MURACA: Eva Sevick with Texas A&M Department of Chemistry and Chemical Engineering and developing new technologies for blend content uniformity monitoring.

MR. HALE: Tom Hale, President, Hale Technologies.

DR. MORRIS: Ken Morris from Purdue University.

DR. HUSSAIN: Ajaz Hussain, Office of Pharmaceutical Science, FDA.

DR. CHIU: Yuan-yuan Chiu, Director, Office of New Drug Chemistry, FDA.

MR. ELLSWORTH: Doug Ellsworth, Office of Regulatory Affairs, FDA.

DR. LAYLOFF: Thank you very much and we'll now turn to Dr. Ajaz Hussain. Ajaz, you're up.

DR. HUSSAIN: Good morning and welcome to the second meeting of the Subcommittee on PAT. My handout should be outside for those in the audience, and copies of the handouts have been distributed to the subcommittee this morning.

I just want to share with you some thoughts on the goals and objectives of this meeting, share with you some progress we have made within the agency, and where we go from here.

I also wish to thank several invited guests whose names appear on the program, and others who will be speaking and will be participating, for example, from NIST and from Measurement and Control Engineering Center in Tennessee. Professor Kelsey Cook, I see him in the audience--there he is--and so we hope this will be an exciting program where we can brainstorm and bring a lot of information so that FDA can quickly, and as quickly as possible, develop a guidance on PAT.

For those who are attending this meeting for the first time, the goals and objectives of the FDA's initiative are to use PAT, or Process Analytical Technologies, as a model technological opportunity to develop a regulatory framework to facilitate introduction of new manufacturing technologies that enhance process efficiencies and understanding. I think those are the two aspects which create the win/win from both the public health and the industry perspectives. With increased understanding of processes, we reduce the risk of poor process capabilities and so forth, and at the same time increase process efficiencies.

The goals and objectives of the discussions today are to identify and eliminate perceived or real regulatory hurdles, and these are the goals for the general guidance that we are trying to develop. At the same time, we are trying to develop a dynamic, team-based, scientific approach for regulatory assessment--a review and inspection team for these new technologies. I'm pleased to let you know that we have essentially assembled this team of reviewers and inspectors, and some of them will be participating in this meeting also.

And also, last--but not the least--I think we have to start moving and thinking about international harmonization. EMEA, CPMP have issued a guidance in September on parametric release which has certain bearing and certain commonalities with what we are trying to do here, but at the same time, I think there are significant fundamental differences that need to be identified and resolved. And some of that discussion will also happen today.

One question that comes up is why process analytical technologies? We believe process analytical chemistry has sort of matured and has proved its usefulness in many other industries but has not really been adopted in pharmaceuticals to a large degree.

We believe that PAT provides an opportunity to move away from the current testing-to-document quality paradigm to a continuous quality-assurance paradigm that can improve our ability to ensure quality was built in or was by design, and we think this is the ultimate realization of the true spirit of cGMP.

One of the things which excites me personally with the PAT technologies is you actually bring physics and chemistry together to bear upon the measurements that you are dealing with. Traditionally, we look at--actually destroy the physical information by dissolving and then doing an assay. So that's in my mind a significant advance with why PAT can help us.

We believe optimal use of PATs can provide greater insight into and understanding of processes; bringing these technologies at-line or in-line to measure performance attributes is a better approach than taking samples and testing in the lab.

We also have the possibility of real-time or rapid feedback controls, which is generally not practiced in the manufacture of pharmaceuticals because this can allow us to focus on prevention; potential for significant reduction in production cycle time and, in parentheses, in development. I think this is one of the challenges that we face today with PAT. Many of the champions for PAT in pharmaceutical companies are in manufacturing. The R&D folks either have not embraced this to a degree or are, in fact, opposing it. And there are many reasons for that. In fact, one of the reasons is many of the formulation development folks probably do not have the level of understanding of what PATs can do for them. And they're so in tune to the traditional ways of making formulations that there really is an educational campaign that needs to occur.

But I think more importantly we minimize the risk of poor process quality and reduce regulatory concerns. I don't have to outline the regulatory concerns in the manufacturing areas; you see those examples on a daily basis. And my concern is that the crunch in development, due to pressures of getting the product out at any cost, is going to increase the problems in the future. If we don't bring new technology in, the manufacturing problems will keep increasing.

The strategy we adopted was a win/win situation. We wanted to create a win for industry, a win for public health. And we approached this with input from the Advisory Committee for Pharmaceutical Science, the parent committee of this subcommittee, and also the FDA Science Board. And the reason for the Science Board was to bring a high level of scrutiny as we develop this program, because in some ways this is a paradigm shift from a regulatory perspective. And you need all of FDA to be part of this, not just the Center for Drugs.

We have established internal collaboration between CDER and ORA. We have a PAT steering committee. The external collaboration, in my mind, is this committee. And, hopefully, in the future we'll use PQRI to some degree for this.

We are moving down two parallel tracks. Track 1 is a general guidance on PATs, not focused on any technology, per se. The intention is to simply bring common terminology, as well as provide guidance on a regulatory process for bringing PATs in a regulatory framework.

You could imagine this guidance as Chapter 1, introductory chapter to a book if you are writing a book on PAT. What it means is, subsequently, we will have other chapters, other guidances, more technical guidances as we gather more information and we are able to write those technical guidances.

We are encouraging submissions now. And we are planning to have a team approach for review and inspection for these submissions. I am pleased to say we already have one submission--a company has already come forward. A second company is working towards that, so we have two companies which have expressed interest.

A progress report could be sort of looked upon as the meetings that we have had. The first meeting on PAT was on the 19th of July 2001, then the 16th of November FDA Science Board meeting. One of the major aspects of discussion here was that PATs need to be voluntary. These need not--these would not be a requirement. So a company can choose to use PATs, but it's not a requirement. So that was one of the fundamental aspects that we established with this meeting.

At the second Science Board meeting, we established the concept of a safe harbor or at least discussed the concept of a safe harbor, which I'm hoping that this committee will help us define it. I don't like the term "safe harbor" because--and I haven't used it in the questions that I framed to you, because I don't think we need a safe harbor. All we need is clarity of how regulatory decisions are made, and I think it will be fine. Personally, I don't like the term "safe harbor," but you could use it if you want to.

Now we are at the second meeting of the PAT Subcommittee. We originally had planned for only two meetings, but our task has sort of increased and we will have a third meeting of this committee.

Let me share with you the time lines. We are here today, the red arrow, the second subcommittee meeting, and the third subcommittee meeting is being planned late September, early October, sometime on that time frame. We haven't even started discussing what exactly the date would be. What we hope to do is to gather information from you relevant for inclusion in our draft guidance, which we hope to have an internal draft ready--I can't commit to a release date, because that's totally not under our control--so we will have a working draft internally, which we hope to get out as soon as possible for public comment.

We would like to start our training program in October, and I look forward to receiving input from you on how we should structure the training program and the certification program. So that's sort of Track 1.

Track 2 is submissions now. The first company has come in, and that track essentially got started in May. So we are moving on Track 2 at the same time. Those small microphone or loudspeaker icons there indicate a lot of the presentations that we have done--I've lost track of the number of presentations I have done on this. I just wanted to emphasize I've been visiting companies like Aventis, BMS, Pfizer, AstraZeneca, and others, trying to build consensus, as well as gather information on how best FDA should develop this guidance.

Let me briefly talk to you about Meeting 3. What will Meeting 3 focus on? One issue we'll focus on is computer validation, including the chemometrics part of it and Part 11 issues. Because we still have a number of other issues to resolve and we want to focus on those today and tomorrow, the Part 11 and computer validation issues will be tabled for the next meeting.

Rapid microbial testing, we are sort of expanding the scope of tools that we use in PAT to include rapid microbial testing. And our Advisory Committee at the last meeting endorsed that that should be part of this. We don't have all the talents, scientific expertise on this committee to handle all the microbial issues, so we plan to use the third meeting and include some more members from microbiology to participate in that meeting to see how rapid microbial testing could be part of the PAT initiative.

The third thing which I would like to do--and I need your help for that--is at the third meeting, I would like to have a dry run. What I mean by dry run is using a mock application submission inspection. Can we use an afternoon session and actually walk through a submission and the review and inspection questions that could come from that?

I need your help because I think I'll need you to help me create that mock application and so forth. So, please give me your suggestions on how we could do this. What I'm hoping is we could focus on maybe two case studies. For a drug substance manufacturer, we could use online GC or HPLC as a model, or a Raman technique, and go through that process and see what things we haven't addressed that should be addressed in the draft guidance. And for drug product, what I'm suggesting is we could use online NIR for blending, drying, and so forth, to create that mock example and walk through that.

So, today, day one of this meeting, we have clearly defined the questions for the subcommittee. It's in your handout packet. We have provided for you our current thinking and posed those questions. And these questions deal with regulatory uncertainty or risk and how best to address those. So most of the meeting today would focus on those questions.

But we have left the questions undefined for the working groups. I'm hoping that you will frame those questions toward the end of this day and how we want to manage the working groups. And we have built in flexibility. We were planning two working groups: one on validation, one on development. But, for example, if we need a third working group on training and education, we could have that group as a possibility, or a fourth working group, so we have accommodations available. I'll look for your input on how best to manage day two.

In my handout, the last page, for example, is a set of questions that we received from Jozef Timmermans from Merck, of what Merck thought were the questions relevant for validation. So you have those set of questions for the validation group, and I'll also pose some additional questions here. But towards the end of this day, if we can sort of refocus those questions and come to some agreement of how, what are the most important questions to discuss.

Training and certification program is an important topic, and we really look for some feedback from you, and then we'll identify questions for in-depth discussions by the working groups on day two.

Process validation working group definitely will be in this room. We will have--that probably will be the biggest working group. Product and process development working group would also be--definitely be there, but other working groups could be training and certification and possibly a regulatory process. I'm excited to see, you know, Jeff and others from Regulatory Affairs who have joined in. So that could stimulate some of the discussion that if possible.

For example, I think, the questions that we had in mind for the working groups, I'll just lay them out for your consideration.

Please identify and describe approaches for introducing PATs, for existing validated products, for new products. I mean the type of questions that we are--the type of information that we are looking for is some sort of a scenario of the steps necessary to do this and how the regulatory system should interface and when should it interface.

For example, PAT R&D efforts in pilot plant, a company may start at the pilot plant to establish proof of concept and suitability for application in manufacturing. What should be documented to justify suitability? PAT R&D efforts could then move to manufacturing where you'd actually say, for example, blend--bring a blender with online NIR, same design and operating principle, and run that in parallel to your current manufacturing.

What should or would constitute acceptable verification of suitability and validation under those conditions? And once you have established that for routine manufacturing using PAT, what should the regulatory standard be for accepting an online measurement to replace end-product testing?

What is the level of built-in redundancy? If the sensor fails, what is the backup for that? And then identify steps to resolve out-of-specification observations. Under what conditions can end-product testing be used to resolve an out-of-specification result? Because you are looking at a validated process in the traditional sense, why can't we use that as a backup system?

The distinction here I think you have to pay attention to is that the parametric release concept originally grew out of terminally sterilized parenteral products. Under that scenario, if there is any deviation from the validated process, sterility testing is not a viable option. You cannot rely on sterility testing to release a batch if something happens in your manufacturing. So, it's end of story then.

But PAT, in my mind, is somewhat different. So I think we have an opportunity to define under what circumstances end-product testing could then be a reliable way of resolving this. But I need your help to define that for us or sort of discuss that.

Continuing on, the questions for working groups from an FDA perspective. Using online NIR for blending, drying, content, and dissolution, and HPLC as an example of PAT, please outline the essential experiments--what I mean by experiments is hypotheses or questions to be posed--that should be conducted by a company to successfully develop and validate these tools for use in manufacturing operations. I'm essentially setting this up for the next meeting.

What criteria should be used to ensure that relevant critical formulation/process variables have been identified and appropriate PAT tools selected to ensure their optimal control?

What information should be collected to justify use of indirect measurements, such as signatures or correlations, that relate to product quality and performance attributes?

When and to what extent would FDA involvement facilitate PAT R&D and application projects? And so forth.

So those are sort of our suggestions, combined with the questions from Merck and questions that you have, that I think will frame the discussion for tomorrow.

I just want to emphasize again, sort of--but I want to end my presentation with just sort of a case study. The general guidance--I want to emphasize so that I'm not creating a high expectation. The general guidance is not a technology guidance. General principles and terminology is what we will focus on. Address issues related to regulatory uncertainty and clarify the regulatory process. We hope there are other tangible benefits: serve as a tool for building consensus, especially within-company consensus, and promote research and development in this area.

Some thoughts on general principles and terminology. The first question that is posed to you in your handout is definition and scope of PAT. I think it's important to define that very carefully and clearly.

And, also, I'm asking you to sort of develop a shared vision for this group. What do you--what does PAT mean to you? What is the current state and what is the desired state you are trying to achieve using this new technology?

From my approach or from my thinking, the win/win comes from a higher level of process understanding; functional or performance-indicating process controls and specifications that we will set using a systems approach; a high level of process quality; minimal reliance on end-product testing; an improved scientific basis for regulatory functions; and rational, risk-based documentation requirements. And the point I'm trying to make here is that, in the current manufacturing paradigm, the GMPs have to be very, very laborious and documentation is so critical because, in many cases, the manufacturing is a black box and we rely on very limited end-product testing; that's why we have the extensive GMP documentation requirements.

Any deviation from that results in a problem. But now, when you make the process more transparent, what should the documentation be? And that's somewhat a Part 11 issue, also, that we'll discuss. But, also, clearly high efficiencies for all operations, from industry and FDA operations.

So, my thoughts on PAT, I see PAT as a tool in a whole quality system. And here is a quote from a book on total quality control which was published in '83, and it sort of charts out the evolution of quality systems in the U.S. In the 1900s we relied for quality only on the operator, then we added a foreman, then we added the concept of inspection, then we moved to statistical process controls in the '60s, and then we went through the concept in 1980 of total quality, now we generally talk about total quality management system.

And the point here I think is that "Real assurance of quality today requires far more than good intentions, testing and inspection activities, and a traditional quality-control department." This was said in 1980. "It takes the same business, managerial, and technical depth to assure the quality and quality cost of the product as it does to design, make, sell, and service the product itself - depth that starts well before production begins and ends only with [customer satisfaction]."

What I see is PAT is a tool that enables us to move in this direction. Many have or some have argued that the pharmaceutical--there's no role of statistical process control in pharmaceutical manufacturing. You know, I read a book by John Sharp from the U.K., and it's a very well written book. I agree with all of the things he has said in that. But towards the end he said we are not making, you know, machines and so forth, so statistical process control has no role in pharmaceutical manufacturing. I said that's old thinking. And we'll leave it at that. So PAT is a tool that enables us to move in that direction.

A second sort of perspective on PAT is that if you look at the facts or the trends in quality, we started in the 1950s with sampling plans, then came the zero-defects movement in the '60s, ISO 9000 in the '80s, you know, quality system 9000, the Malcolm Baldrige Award, the European Quality Award, total quality management. Now the buzzword is Six Sigma, and the buzzword has changed to Ultimate Six Sigma, and so forth.

The point here is GMPs came in at that point, and if we don't understand processes, all these are fads, because unless you understand the variability, the sources of variability, you really cannot improve quality; you cannot go to Six Sigma. And with the measurement systems we have, we don't have a hope of moving pharmaceuticals in this direction. So that's where I see PAT coming in to help us move in this direction.

Now, let me end my presentation with sort of a case study. The case study is a study that helps me look at PAT. And what I would like to do is take a case study which people consider the most difficult case: how would we do on-line or at-line assurance of an acceptable dissolution rate? Okay? And it's a hypothetical case study, but with real data. And the real data is FDA data.

So now let's imagine dissolution of a tablet is a function of particle size of the drug, amount of excipient 1, amount of excipient 2, a process parameter 1, a process parameter 2, okay?

Process parameter 1 is, say, blend time. Process parameter 2 is a compression pressure or force, and you have an in-process test of hardness.

Currently, the way we assure quality is you have level 1 quality assurance, which is essentially the GMPs: specs of incoming materials, SOPs, process controls and so forth. And then level 2 quality assurance tests conformance to the dissolution specification along with other specifications.

The data is real. The reason I'm calling it a hypothetical case study is that we did this study in a retrospective fashion. We had just finished our manufacturing project at the University of Iowa. The drug is furosemide. And we had designed an experiment of different formulations and we were ready to do a biostudy. So we wanted to link NIR analysis to the biostudy, because that is possible now that you're doing it nondestructively. So we can actually measure the amount of drug in a tablet and also estimate its dissolution rate before you give it to a patient. So that was the link. But here it is for dissolution. I don't have the data for the biostudy yet.

What we could do is take the NIR spectrum of a tablet, measure the dissolution of the same tablet, and establish a correlation. And here we have taken the entire spectrum. So you have an at-line tablet NIR spectrum and a dissolution correlation. You have a training set, which is the graph, and then you test how good this correlation is using a different test set. And what you see there is that you have wonderful predictions, and if the end-product testing is a one-point specification--that Q is more than 80 percent or 70 percent dissolved in 30 minutes--there's no problem in meeting that requirement. But the concern I have is that this is a black box. Validation of this is based only on the predictive performance of the calibration. In fact, that would probably be equal, in terms of regulatory requirements, to what we do with an in vitro-in vivo correlation. If you look at our guidance, how do we make decisions to waive biostudies when you have a correlation that's based on predictive performance only?
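
To make that calibration step concrete, here is a minimal, hypothetical sketch of the approach described above: correlating tablet NIR spectra with measured dissolution and judging the model only by its predictive performance on an independent test set. It uses synthetic data and the scikit-learn PLS implementation; none of the numbers correspond to the FDA or University of Iowa data.

```python
# Minimal sketch: calibrate tablet NIR spectra against measured dissolution and
# judge the model purely on predictive performance on a held-out test set.
# All data below are synthetic stand-ins for real spectra and dissolution results.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_tablets, n_wavelengths = 120, 200

# Hypothetical spectra: one latent composition factor plus measurement noise.
composition = rng.normal(size=(n_tablets, 1))
loadings = rng.normal(size=(1, n_wavelengths))
spectra = composition @ loadings + 0.05 * rng.normal(size=(n_tablets, n_wavelengths))

# Hypothetical response: percent dissolved at 30 minutes (Q30).
q30 = 80 + 6 * composition.ravel() + rng.normal(scale=1.0, size=n_tablets)

# Training set builds the correlation; an independent test set checks it.
X_train, X_test, y_train, y_test = train_test_split(spectra, q30, test_size=0.3, random_state=0)
model = PLSRegression(n_components=3).fit(X_train, y_train)
pred = model.predict(X_test).ravel()

print(f"test-set R^2 = {r2_score(y_test, pred):.3f}")
print(f"predicted tablets meeting Q >= 70% at 30 minutes: {(pred >= 70).mean():.0%}")
```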

So that type of correlation validation would be consistent with our current standard for waiver of biostudy. But I think we can go a step better. What are the critical formulation variables in this? For this formulation, dissolution was predominantly affected by the disintegrant level and by interaction terms involving disintegrant and diluent and diluent and magnesium stearate, so, we know it was mainly composition based.

The hardness, the compression pressure really did not have an effect. And that's typical of formulations that contain a super disintegrant; you actually eliminate all the process variables because the super disintegrant takes over. So that's consistent with that mechanism. And when you do a modeling of those components and dissolution, you are able to explain 93 percent of the variability. So it's a fairly decent relationship between composition and dissolution.
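
The composition-based model described here can also be written out explicitly as a least-squares fit with the interaction terms mentioned above (disintegrant x diluent and diluent x magnesium stearate). The sketch below illustrates that model form only; the component names, coefficients, and data are invented, not the furosemide results.

```python
# Sketch: explain dissolution from composition main effects plus the stated
# interaction terms. All columns and data are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 60
disint, diluent, mg_st = (rng.uniform(size=n) for _ in range(3))

# Hypothetical relationship dominated by disintegrant level plus interactions.
q30 = (55 + 30 * disint + 10 * disint * diluent - 8 * diluent * mg_st
       + rng.normal(scale=2.0, size=n))

# Design matrix: main effects and the two interaction terms of interest.
X = np.column_stack([disint, diluent, mg_st, disint * diluent, diluent * mg_st])
fit = LinearRegression().fit(X, q30)

print(f"fraction of dissolution variability explained: {fit.score(X, q30):.2f}")
```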

So what we could do is here--I told you we have a black box, but rather than leaving it as a trick, we could actually make it more transparent and more science-based. And now the proposal here is: you can take the NIR spectra, you know the critical variables, and you link those together. Can we measure those components? And we could. So you have taken a step beyond validation of a black-box correlation to something which is a meaningful link directly to the variables.

We did it at line. I don't see any problem doing this on line, or even taking it further upstream. Using blend uniformity data and some tablet compression data, you can actually do this. So, by doing this, I think what we have been able to show is that these are pretty straightforward things to do. And all we need to do is make these available.

The challenge comes in that this was a small-scale study. We did that as a 3-kilogram or 5-kilogram batch. Then the question would come: when you scale up, will that still remain? We did scale up, to 16 kilograms, but I don't have the data on that one, so I'll show you a different example which shows the scale-up aspect.

Here is an example from Metoprolol tartrate and the box that you see on your left-hand side, upper side, is a designed experiment dissolution rate. And in this case, dissolution was a function of magnesium stearate, microcrystalline cellulose, and sodium starch glycolate.

We linked it to dissolution in bio, on the right-hand side. But this work was done on a small scale at the University of Maryland as part of our SUPAC research program. We didn't stop there. We said, can we generalize that small data set to the submission data that we have in-house? So we took that information and developed a neural network. This work was done by Vijet Damara [ph], who is now at Sanofi. He did that when he was a reviewer here. And he actually predicted what the dissolution should be for the generic tablets and the innovator tablet from our submissions. For all but two, we could do that very, very well. And that took 10 minutes.

For the two formulations that we were not able to predict, the difference was their ratio of sodium starch glycolate and MCC, inside or outside; there were significant enough differences that they really didn't fit the pattern. But for the rest, it did. So the scale-up could be sort of verified--scale-up was not an issue. And we didn't have the NIR at that time, but we could have connected it to that.
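
As a rough illustration of the train-on-small-scale, predict-on-submission-data workflow described for metoprolol tartrate, the sketch below fits a small neural network to designed-experiment composition data and then predicts dissolution for a separate set of formulations. The network size, composition variables, and all data are assumptions for illustration, not the actual SUPAC or submission data.

```python
# Sketch: train a small neural network on designed-experiment data, then
# predict dissolution for a separate set of "submission" formulations.
# All data here are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)

def dissolution(mg_st, mcc, ssg):
    # Hypothetical response surface in magnesium stearate, MCC, and sodium starch glycolate.
    return 60 - 15 * mg_st + 8 * mcc + 20 * ssg + rng.normal(scale=1.5, size=len(mg_st))

# Small-scale designed experiment (training data).
doe = rng.uniform(size=(27, 3))
y_doe = dissolution(*doe.T)

# Separate "submission" formulations to predict.
subs = rng.uniform(size=(10, 3))
y_subs = dissolution(*subs.T)

net = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0))
net.fit(doe, y_doe)

for pred, obs in zip(net.predict(subs), y_subs):
    print(f"predicted {pred:5.1f}%  observed {obs:5.1f}%")
```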

So, that's the concept. I think we need to understand that when we do experiments on a small scale in the traditional way, when we don't have the right measurements, it's difficult to scale up rationally. And here is an example I would like to use from Ken Morris at Purdue. When you do on-line analysis of blending and are able to gather information about the kinetics of blending, you can actually model and predict what the large scale should be. And Ken is here; he could talk to you about that, but he has done this only for drying and for blending.

So, with PAT, you are actually gathering far more scientific information to actually do rational scale-up and be predictive of what can happen, instead of saying, oh, the scale is not going to work.

I'll end my presentation with built-in redundancy. I'd really like to have you think very differently about this. Redundancy need not mean two systems. For example, I have an NIR unit which is measuring some attribute. We want to have a backup system for that. That doesn't mean that I have to have two NIR units. The picture there shows different locations of NIR for blending. That's only to illustrate that we don't need to have multiple sensors, but simply look at redundancy as a systems approach.

For example, when you look at a systems approach, the overall quality system is the first level of defense. Then come product-specific SOPs, your raw material classification, and so forth. That's your second level of defense. Once you get through that, you have actually eliminated a lot of the variability. Then comes PAT, and so on.

So, when you look at it as a systems approach, what comes out is that there are many tests, many measurements that actually overlap, and you can actually use them as backups; they need not be two separate systems. So, I think we have to start thinking about that in different ways.

With that I'll stop and give it back to Tom.

DR. LAYLOFF: Thank you, Ajaz. It's a very exciting time. I think the last slide brought an interesting point. I think it's an aggregation of measurements that are critical to the product quality, not a single dimension at a point in time.

I think it's also very exciting that we're going to be doing microbiological tests, because if you look at the chemical tests, it's fairly easy to see that you can change the technology without changing the bar. But I'm not sure how difficult that's going to be with microbiological testing; when you shift from microbial limits based on plate counts to DNA or other technologies, the bar may actually shift.

Anyhow, I also think the critical issue for us is going to be training and certification. The competence of the reviewers and the investigators is going to be the keystone for this whole process. If we don't have well-trained reviewers and inspectors, this thing will not go well. So, your input as to content, structure, and certification of competence is going to be really critical in how the FDA moves forward on this.

And as Ajaz mentioned, we've gone from four committees to two, but that's a flexible yardstick. We can move back to four if we need to, and we'll look to you all for guidance as to whether we should increase the number of committees that we break into for the guidance.

Now, we have the subcommittee discussion on training and--

DR. HUSSAIN: Why don't you go to the invited speakers and then--

DR. LAYLOFF: Okay, let's go with--okay, we'll change that around, okay. Let's go with Jeff, Jeff Blumenstein, from Pfizer, formerly FDA.

DR. BLUMENSTEIN: Thanks, Tom. We welcome the opportunity to share some thoughts today on PAT. Let's see, there we go. I'd like to present some perspectives about some regulatory challenges that may be relevant to PAT applications in new NDAs.

When we go forward and try to develop new NDAs, it's really, from our perspective, a balance of goals. You know, we're developing a new product, and it's a balance of activities to try to meet mutual goals--designing the product and processes and the methods, as well as other goals like facilitating the rest of the program and the production of clinical supplies. And then, looking toward commercialization, developing the process knowledge and transferring the technology. So there are a number of different drivers, and at the end of the day we're all trying to balance different things, like time, resources, and costs. With enough time, people, and money, we could always do everything, but at the end of the day we've got budgets to maintain and time lines to meet to try to bring new products to market.

And that really is where the balance of goals comes about: first, trying to facilitate the commercial aspects--manufacturing right the first time--but in a reasonable time frame and with a reasonable number and type of experiments, to get the drug to the patient, because that's what we're all there for: to bring new NDAs and drugs to patients.

So with that, what are some opportunities for PAT in new development programs? It really is a process knowledge tool, so we're trying to build the information set for commercialization, looking at all the various variables and capabilities that could be in there--scale, component variation, many of the things that Ajaz already mentioned that experiments are ongoing with--as well as fundamentals with regard to formulation development and formulation solid-state interactions. It gives us a wealth of knowledge about that.

But as we're developing that knowledge, I think what we're a bit cautious about is that it probably really isn't an optimized control tool for clinical development batches. The clinical development batches will probably provide an opportunity to gain that process and product knowledge, but it's probably not developed into the control tool at that point in time.

As we look forward to putting together the NDAs, what are some potential challenges to the application of PAT in development programs? Well, comparison is often difficult. As we're going through development, batches are often unique experiments--scaling up, developing new pieces of equipment, moving from site to site--so some of those parameters are changed. We're evaluating the impact of those various aspects on the product and its characteristics, but it's an evolution as we go through development. And, similarly, you know, we speak very frequently about PAT in drug products, but there are certainly opportunities in drug substance; coupled with that, synthetic processes are evolving. The route may be the same but, again, we're looking at all the various aspects of change and scale-up as we go through that.

And in some cases, depending on what the clinical needs are and the size and scope of the program, experience may be limited. We may not be making a huge number of batches, really, to look at.

So with that being said about the downsides, I think there are, you know, certainly some positives: we can look, in development, and try to determine what parameters are appropriate for monitoring. We may not determine what all of them are, but it's the beginning of that look.

As we mentioned, also, the commercial process may be limited at filing. Certainly the scale is often small, but we're moving towards the commercial manufacturing sites, and the number and limits of our experience are something we have to deal with. And I guess the one other piece to emphasize, as well, is that very often development programs are moving very rapidly and, for some of the parameters that I mentioned in the first slide about the challenges, we're often material limited. We're trying to serve many different customers in the development program, so we have to be cautious about which ones to serve, and that may limit, perhaps, experiments or how many batches we want to make, say from a commercial or manufacturing perspective, because we have to make sure the clinic stays supplied as well.

As we look towards, you know, potentially some of the regulatory strategy, what are some of the other challenges to the application of PAT in new NDAs? In many cases, at least at this point in time, reference methods are probably still going to be required, whether they be for regulatory surveillance programs by the FDA or other authorities. Compendial monographs: at this point, we don't have a plan for how the USP is going to accommodate that if we have a different product coming off the shelf, because PAT may be relative to the process as well as the product. As well as acceptance testing in global markets. I know this is a U.S. FDA committee, but as a global organization we look at, you know, certainly worldwide approval of many of our products, and many markets will still have certain requirements for acceptance testing to bring product into their market. So, you know, we're looking at a global regulatory program and in many cases will need acceptance testing for some of those global markets.

I've touched on, already, the size and the scope of the database with which to set criteria. You know, in many cases, we'll have our best thinking, but what's really a normal process and what's a variation from that normal process versus a true variation and a failure in the process with that limited database is something that's very challenging as we're putting together the NDA.

And the other aspect is, technology evolves over time. As much as we do try to bring forward NDA programs very rapidly, sometimes it is a multiyear process and many of you that are much more deeply entrenched in the technology know, that by the time we actually file an NDA, the technology has moved. So what we start actually looking at with the process in the first couple of clinical batches may not be the best technology tool that we really want to move forward within commercialization. So we have to be cautious of not handcuffing ourselves by looking back towards that early development experience and the tools available at that point in time.

So, as we look forward to that from our perspective, what are some options for new dossiers? You know, we could just briefly describe the PAT that we anticipate doing towards commercialization and just sort of let the agency know where we think we're going in the future. Or we could go to something more rigorous--change protocols; they've been discussed in various other contexts of filing NDAs, and PAT might be a good example of that. We might do things like describe what body of data is going to be needed in the future to move forward to full acceptance of PAT.

What changes with the adoption of PAT? Are we going to drop some of the conventional tests? Are we, in fact, going to actually change the manufacturing description? Is PAT going to be an end-point, rather than just a control tool that we tie some manufacturing process limits to? Those are just, you know, some illustrative examples of what we may look forward to.

We have to be conscious, as well, that if we put a change protocol in, we probably need some description about discontinuing PAT activities. As we go towards commercialization, what if we learn that it may not be the best thing to control? What will we do if we want to roll back and relook at something? So any protocol may need to talk about discontinuation of PAT activities or regrouping in a totally different direction.

And, as with anything, like a change protocol, we have to talk about filing mechanisms. Is it going to be, you know, upon commercialization? A supplemental filing at some point thereafter and then we can talk about is it going to be a prior approval? Is it going to be a CBE--a CBE-30, all of those different aspects? Or are we going to be so comfortable in the future, it'll just be an annual report?

So not to be too down, I mean, I've spoken about some of the challenges. But I think that PAT does afford some great opportunities. It does allow us to gather the process knowledge early in commercialization. If we get on-board with what we're going to test, I think, as Ajaz mentioned, the potential for understanding and looking at the process is great with PAT and managed well, it can provide some great input to early commercialization.

With the protocols, it also allows us the opportunity to agree on the dataset being developed so we don't have any second-guessing thereafter. It allows that we make sure we've got the full dataset and we don't have any gaps.

And one of the more challenging aspects is: will it go to such a level of detail that we'll agree on the criteria for success? And that may, again, come back to what the mechanism for it is going to be. If it's something as straightforward as an annual-report-type change, we'll definitely need to talk about criteria for success. If it's just going to be the broader narrative descriptions of what we're going to monitor, that may be something we have to look back at later and then negotiate on.

And, certainly, as I mentioned, if we have other methods, like reference methods, the protocols, we'll probably certainly need to talk about how we're going to correlate to those reference methods.

So, what are some of the risks? One aspect is that, as we go through and actually look at it, the PAT information may suggest that our initial thoughts were really pretty far off the mark and we really have to change things, so we have to handle it differently than we originally thought.

The monitoring, as we go into commercialization with PAT may suggest that we see things we didn't appreciate with the early reference methods. And I know that's certainly an aspect, certainly as we look towards protocols and, perhaps, a bigger aspect as we look towards post-approval type things, like supplements if we move towards that. What do we learn about old products? And I know it's going to be a topic of some discussion, as well.

We're not the only ones sharing risks. I think the FDA, as they look towards things like protocols--especially if we look towards changes-being-effected supplements or annual reports--is going to have to accept a commitment to a future change with a very limited dataset, and how comfortable is that going to be?

If we really look towards some of the opportunity being in the post-approval setting, maybe we wind up talking about a different area, so we have to think about two different aspects of that. One is the post-approval review burden. And the other is: is this going to be another piece of an NDA in an already very constrained resource environment during NDA reviews? We have to just be cautious that it doesn't detract from the approval of the NDA and slow down the process.

So, in summary, the opportunities for PAT, you know, they do exist and they're very valuable. At this point, in looking forward, the opportunity may really be in a transition to post-approval activities. Is everything going to be so ready and finalized that it's going to be ready by the time of NDA submission and we'll be able to roll into that? At this point, that's, from our perspective, probably unlikely.

The challenges do exist, both from the FDA's perspective on the need to make the information available so that they can make the right judgments. And, also, from our perspective to make sure that we get new products out there as rapidly as possible.

So, with that, I think the committee certainly has quite a bit to speak to of looking forward to the opportunities in trying to balance the risks and I look forward to hearing the discussion on those topics.

DR. LAYLOFF: Thank you very much, Jeff. I think in our discussions on PAT, we've discussed several times that we didn't think the bar should be raised, and there is a certain acceptance of what a quality standard is for a product. That probably will stay with some kind of reference method that we could use in stability testing or something like that, and that would be the ultimate reference for the PAT. And the PAT would be just targeting that.

Is Steve here? Is Steve here? Okay, now we're going to go to Dhiren Shah, from Aventis.

DR. SHAH: Thank you. Good morning, everyone. I'm really pleased to be here to share my thoughts and my company's thoughts on post-approval PAT application and the challenges around it. First of all, I would like to thank Ajaz Hussain and FDA for inviting me to come to this meeting and share my thoughts.

As a way of outline, I would like to discuss, first of all, what is the need for post-approval or what I call PA-PAT applications? Is there a need for that, you know? And if there is, you know, how do we address that?

Challenges in PA-PAT applications. Once a product is approved and commercialized what are the challenges in bringing PAT in the regulatory area?

PA-PAT applications to APIs, the drug substances--Ajaz spoke about that a little bit, and Jeff as well--and to the drug products. How do we apply PAT once a product is approved?

Then the really important point, from a regulatory perspective or a pharmaceutical company point of view: guidance development--you know, guidance to the industry on how you go about applying PAT to an approved product. From a CMC review point of view, what do you need for CMC review? What type and amount of CMC information is needed? This almost sounds like the SUPAC guidances; you know, that's what the workshops and the committees did for SUPAC--how much CMC information is needed, and what type of CMC information will be needed to show equivalence. And the regulatory submission type. Jeff spoke about it, you know: standard prior approval supplements, the various kinds of changes-being-effected, or annual-report kind of reporting.

And then the compliance side, you know, the second part of the equation, which needs to be thoroughly discussed. And then I'll give some summary and concluding remarks.

Why do we need PA-PAT? To improve the quality of existing products. There is no doubt that the pharmaceutical industry, in general, is behind the rest of the other industries--food industries, chemical industries. I've been in this business for 25 years and I know there are still products being made with very old technology--a simple mixer, and stopping the mixer and putting a hand in it to feel whether the granulation is done or not. Honestly, that's, you know. And, of course, there are technologies such as high-shear granulators where you have, you know, kilowatt end-point measurements for granulation. But technology-wise, the pharmaceutical industry is backward; it's behind.

And, again, it's by necessity, you know, the nature of our business is such that we stay with that.

Improved analytical testing. Again, at present with sampling, you may have a batch of 1 million tablets and you may take 100, 200, or 500 tablets out of that whole batch, and you hope that the samples really represent the whole batch. And that's a big risk.
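
To put a rough number on that sampling risk, the back-of-the-envelope calculation below shows how often a random sample of a few hundred tablets drawn from a million-tablet batch would contain no defective tablets at all; the defect rates used are illustrative assumptions only.

```python
# Back-of-the-envelope: chance that a random sample of n tablets from a very
# large batch contains zero defective tablets, for a few assumed defect rates.
# (The binomial approximation to the hypergeometric is fine for a 1,000,000-tablet batch.)
sample_sizes = [100, 200, 500]
defect_rates = [0.001, 0.005, 0.01]  # 0.1%, 0.5%, 1% of the batch defective

for p in defect_rates:
    for n in sample_sizes:
        p_miss = (1 - p) ** n  # probability the sample shows no defects at all
        print(f"defect rate {p:.1%}, sample of {n:3d} tablets -> miss probability {p_miss:.1%}")
```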

Increase manufacturing efficiencies, again, in some cases you can really improve manufacturing efficiencies by applying PAT.

Reduce--hopefully eliminate--out-of-specification results, avoid potential recalls, and enhance compliance: they all go hand-in-hand. By applying PAT, if we can really reduce our out-of-specification results, that will be a big achievement. And, of course, when you add all of those up, there will be--I am sure that there will be--potential long-term cost savings to the companies and ultimately to the patient.

Challenges in PA-PAT applications. There are two kinds of post-approval situations, in my mind. The first kind is products without PAT applications in the original submission, which is the majority of cases right now, because products have conventional controls where you don't have PAT. Now how do you apply PAT post-approval?

Identify process-critical control parameters. You know, once you identify, then you can think about applying PAT to those critical processing parameters.

Replacement or adjustments to in-process controls and, possibly, final specifications. Once you find out that certain PAT can be applied, for example, for blend uniformity, or for tablet hardness, how do you take the conventional, in-process specification and then apply PAT to that? And how do you replace that? And that's a challenge.

Correlation between PAT-based controls and approved conventional controls. This is very obvious that you already have products with conventional controls in place, how do you correlate that with the PAT control?

And of course, the review and compliance issue. This you will hear time and time again: at the end of the day, you know, our products are approved, and when you make changes, what are the review processes and the compliance processes that will be used to allow us to change to PAT?

OOS--out of specification--that will happen, you know, that has always happened, with the best intentions--with the best products, out of specifications occur, how do we handle that?

And, in my opinion, for products which do not have PAT, it may be difficult--not impossible--difficult to apply PAT post-approval.

The second scenario is, obviously, for products where you already have PAT; that is, again, looking to the future. You know, right now, as we understand it, there are not too many products with PAT in place for manufacturing the commercial product. For these types of products, there are changes to approved PAT-based controls--Jeff talked about it a little bit. Once you have some PAT controls in the initial phase, you may learn, with time, that maybe you don't need one or you want to replace it. So you may want to change the PAT-based controls after approval. Addition: you know, you may understand the process more and you may want to add a new PAT-based control for a given product.

Deletion of a specification to eliminate non-value-added controls. Again, with limited experience going into an NDA, you may have some in-process controls, but as you learn that some of those are, say, non-value-added, how do we replace those or eliminate those?

Again the review and compliance process, that needs to be defined. Same old question, out of specification, how to handle it? And I believe, in my opinion, it will be much easier for products which already have PAT in place to make post-approval changes.

For the APIs, very quickly, how do you apply PAT post-approval to APIs? The first assumption is there is no change to the drug substance pathway; it remains the same. And then in-process controls, such as impurity levels at different stages of synthesis, maybe you want to monitor using PAT. Residual solvents, including moisture. An example could be completion of reaction--whether the reaction is completed at a given stage or not.

Isolation and purification steps; initiation and completion of crystallization at the very final stage.

So, those type of things can be applied, those are the examples which most of us know for in-process controls for the APIs.

Correlation between the conventional IPCs, in-process controls, and PAT-based in-process controls. Again, we need to have some sort of correlation. And once you have PAT-based in-process control continuous monitoring, how do we handle API specification? And what will be the role of the final specification for drug substance?

And we all know the question about parametric release, which started out in sterilization area, but can we apply parametric releases after we have certain appropriate PATs in place for the APIs?

Post-approval PAT applications to drug products. Again, the assumption is there is no change in drug product components, composition, and basic manufacturing process.

Drug product--maybe we can consider drug-product-type-dependent PAT applications: solid oral dosage forms, both immediate release and modified release; sterile products; semisolids. So we can consider dosage-form-dependent applications of PAT.

Raw material controls, ID, assay uniformity, some critical physical parameters, like particle size of an excipient. If it is critical for the product, you know, can you apply PAT?

In-process controls for drug products, for example, granulation end-point, most of us are familiar with that. Moisture content in the granulation; blend uniformity, content uniformity of the dosage form. In case of semisolids, maybe, viscosity measurements.

So those are the examples, and I believe the correlation between the conventional in-process controls and PAT-based IPCs will be very important, as we move forward with this concept. And, again, when can we and how can we use parametric release for dosage forms when we apply PAT?

Guidance development for PA-PAT-based controls. From a CMC review point of view, we need to establish equivalence to conventional controls. How do we establish a comparative protocol in an NDA or post-approval--how do we do that? And that's where we need, I believe, some sort of SUPAC-type guidance as we move forward with this technology.

Enhanced assurance that the product will meet what I call SIPPQ--strength, identity, purity, potency, and quality. How to show those, how to establish SIPPQ.

Scientific basis for PAT controls--we have to justify the PAT controls. And then, obviously, as I said earlier, the type and amount of CMC information required: how many batches do you need--are 10 batches sufficient, are 5 batches sufficient--to apply or make this change post-approval? And the scale of the batches. Does it have to be at commercial scale, or pilot scale, or lab scale?

Statistical support--what kind of statistics will be required to support such a change.

Stability requirements--is there any value in doing some stability testing when you make this type of change from conventional in-process controls to PAT-based?

Post-approval commitments--any post-approval commitments, like long-term monitoring of the process, providing more data, you know, after the change is approved?

And the regulatory submission type, Jeff talked about. Could it be--can you do it through annual reports? Are changes being effected in zero days or 30 days or prior-approval supplements?

I should back up for a second. Before I go to the summary and conclusion, I have a slide to show on the compliance side. The industry will be looking to the agency: when you make post-approval changes, going from conventional in-process controls to PAT-based, how will the auditing system work? The compliance audits of the sites? Was the change done appropriately? What kinds of things will be checked? What kind of statistical data will be checked? So, those types of guidances we'll need on the compliance side.

On the summary and conclusion: In my opinion, PA-PAT application is easier for original application with PAT, it's very obvious. But when you go from a product with no PA-PAT, it will be more difficult to apply.

As I said earlier, difficult for original application with conventional controls, it's very obvious.

Proof of equivalence and enhancements--industry will have to show and the agency will have to accept it: when do you show, or how do you show, the equivalence between what you have and what you will be changing to?

Validation, you know, proof is in the validation. When you make a change like this, how do you validate, you know, what kind of validation protocols will be required?

How to deal with out-of-specifications? The role of compliance--that's very critical for this type of activity to go forward. And incentive for the industry--cost benefit. As Jeff said, industry is under tremendous pressure to bring new products out fast and, again, we always analyze the cost and benefit. If we change post-approval to PAT, what's the benefit to the company? Can we reduce, you know, our OOSs? That will be a big benefit for us. On the compliance side, if you can make it easier, that will be a big benefit for us.

And then training of industry as well as FDA staff--that's very important, training on both sides of the equation as we move forward with this. And I welcome FDA's--this important initiative, because, as I said, the industry is really behind in the technology when you compare with the food industry or other industries. And I think this type of technology is badly needed. Thank you.

DR. LAYLOFF: Thank you very much, Dhiren, for telling us more about the complexity of it. I think FDA's going to have its job cut out for it. And now we'll--

DR. MILLER: I'd like to make a comment.

DR. LAYLOFF: Okay, sure.

DR. MILLER: And I appreciate very much your discussion and coming to your summary. We have heard through external organizations, such as CAMP, and other external comments, about the care and sensitivity of just focusing on making the guidance and the regulations simple and easy for original PATs. The concern from these organizations and other discussions is that for current product processes in place, having sensors and technologies applied to them will require more effort by vendor companies. If we go down a pathway of just selecting easy regulations, or very open and general regulations, for original PATs, vendor companies feel that there will not be enough activity or action to stimulate their companies to advance technologies to meet current needs and future needs--and I see Eva's nodding her head across the way. I just want to bring that to the forum here. Please be very careful about how we give the guidance or how we make the guidance for the future.

If it is so very narrow, we will not have the external technological industries wanting to partner; it will be too small a business. And they will not want to waste their resources or time. I just need to get that on the record. Thank you.

DR. LAYLOFF: Yeah, it has to be a win/win for the vendors, too. I mean, if it's not win/win for the vendors, it's not going to work out either. Any other comments before we go on?

[No response.]

DR. LAYLOFF: Okay, going on to Hank Avallone, another FDA alumni--alumnus.

[Brief pause.]

MR. AVALLONE: While that's happening, I just wanted to sort of share with the committee--at the first meeting, we had a discussion that the subcommittee and the working groups are essentially more technical and focused individuals, and the regulatory affairs perspective seemed to have been missing in that discussion. That was the reason I invited Jeff and others to give that perspective, so that the committee understands the challenges and so that we can craft a way forward addressing those challenges.

DR. LAYLOFF: I think that, earlier, looking just at technologies doesn't really address how it's going to fit into the win/win situation, and even bringing in the vendors--it has to come there also. Are we ready now?

MR. AVALLONE: Yeah, I think we're ready to go, Tom. Thanks. I just want to start it out by thanking the subcommittee for inviting me here to make this little presentation.

It's to--just to give you some of my background--it's--I was in FDA for 28 years as an investigator. I looked at day-to-day problems for 28 years. I went with Johnson for the last seven years and I kind of picked up that same role. I looked at--now I don't look at day-to-day problems, I look at day-to-day opportunities.

What--some of the issues that--and this presentation is given from an operational compliance perspective. And it's a little different--maybe a little different slant on PAT and how it's going to affect our operations from an operational perspective and from a compliance perspective.

I date myself with this first comment: I started in the industry in late 1965, when the GMP regulations first started to evolve. And the comment that I recall stood out, which Ted Byers gave and which a number of PMA company quality managers always gave, was that it's important to design quality into the product. We need good development in order to have a quality product.

And I think that has--that's the one thing that I think has stayed with us over the time. In the last 10 years or so, FDA has become more involved in this in looking at the development aspect of our products with the pre-approval inspection program. And I think we've all--have more of an awareness of the need for good development.

And just, since I have this floor here for a couple of minutes, I just had a couple general comments that I've heard some of the other speakers present.

One of the issues that I think we all have to understand is that the biggest--the major compliance issue is old products at today's standards. The bar is constantly being raised. It's not going to stay where it's at; day to day it's going to move up. From an industry perspective, this offers me some type of competitive advantage over other companies, so I'm going to look at it from that aspect, also. I think we need to recognize that PAT is just one part, one of the drivers, for improved product.

There are a number of drivers for a quality product, and it's something that our development managers need to be aware of. I constantly remind them of this on a daily basis when I look at the old products and I look at these opportunities as they arise in my company.

The first one is really operational environmental exposure. We're getting more and more pressure to have concern for the operators, so that they'll have minimal exposure and minimal toxicity coming from the products which they work with on a day-to-day basis. When I develop a product, I have to look at this and look at manufacturing processes and systems and procedures that will give me this minimal exposure for operators.

Another area that we look at, another driver, would be the manufacturing technology, and this is improving all the time as equipment is evolving--new, better equipment is evolving, better testing is evolving. We should look at raw materials. Many of the raw materials that we use are purchased as open materials--they're fair-trade raw materials. And I think it's important for us to develop specifications that are tight enough to give us the consistency and standardization for the process. And we'll talk about this in a few minutes. Also, the API--it's critical to look at this aspect of it, in terms of the physical form, and standardize and control that.

And I mentioned equipment--really we're looking at equipment that's closed and that's cleanable. From an operating company's perspective, I have to turn equipment around in a short period of time. I have to be able to gear up from one product to another. So some of the large, clumsy process-train equipment that I have, I think I'm going to have to take a look at when I develop products.

Basically, I've given the charge and I had a meeting the other day with the VP of R&D. And my charge has been, since I came to Johnson, I want direct compression products, I don't want any wet processes, I want to keep it simple and that's the theme you're going to see with this presentation.

Operating costs, again, minimal steps, keep it basic. That's going to give me the benefit in terms of day-to-day operation. I mentioned the cleanability of the equipment. My cycle times are going to be reduced. I'm going to have to turn the product over, turn this equipment over as many times as I can.

Improvements in analytical technology--we're talking here at this forum about PAT, but this is coming through as just one of the improvements in analytical technology. I'm finding out more about my existing products and, certainly, when I come up with a new product, I'm going to have to look at it a little closer because I know this product is going to have to withstand increased scrutiny over a period of time.

I'm going to see flaws in my existing processes and products--and I do see them. I see them come out in stability testing. I see them on a day-to-day basis.

And the other piece here that we have to look at in development is nonconformances and documentation review. Again, the more basic the process, the simpler the process, the fewer opportunities I'm going to have for nonconformances, the less opportunity an operator's going to have to do something wrong, and the fewer mistakes I'm going to have. So it's going to improve my compliance level by having a product that's developed to today's standard.

Certainly, when we talk about compliance issues, my real concerns are dose uniformity, dissolution, and impurities. I think PAT is moving forward; it's going to address the dose uniformity issue relatively well, and the impurity piece will tie along with that. The more difficult piece is the dissolution piece, and this is the release-rate piece. And I think this is the aspect that we're going to talk about when we get into raw materials, and why it's important to have a simple process, few raw materials, and control of the distribution of these raw materials.

With regard to the API, the physical form is important, and it's important for the developer of the API to communicate with the developer of the dosage form. Two days ago, again, I met with the R&D person and he commented that we have a new product coming down the line and the API developer has given him four different physical forms of the API to work with in developing a directly compressible product. And it is necessary to go that way. I think the days of taking a raw material--I get what I get out of the crystallization process and I just mill the hell out of it, I get a nice micronized or reduced particle size, and I can go ahead and manufacture--I think those days are coming to an end when we start looking at formulation development.

From a GMP, a validation perspective, the physical form of the API prior to milling probably gives me the best indicator of the control and the consistency I have in the manufacturing process for the API. So, I want to establish a meaningful specification for this material at that particular stage, and I may even be able to get by without a micronization process or an extensive milling process for the API I'm manufacturing if I'm able to put more control into that aspect of it.

With regard to excipients, again, I want to keep the number short, and I want physical aspects monitored--good specifications for these physical aspects of the excipients. And one of the concerns that I have now when I develop a product is excipient uniformity. It's not just the active uniformity; I want to know, for example, what's the distribution of my stearate in this particular product? I think another presenter commented on that excipient, and it certainly does have a major effect on my release rate. So I want to make sure that I have a process that gives me the right uniformity of that and the right characterization, particle size, and control of the excipient and the API.

From an operations perspective, I'd like to have multiple sources and one grade of excipient. I know our development managers sometimes like to get somewhat novel and go to a single-source excipient that may be in some location that's maybe out of the United States, and this does present problems from an operating company's perspective. Again, I want to keep the excipients relatively common ones that probably have multiple sources.

The ideal process, from my perspective, is the direct compression process: screen, blend, and compress. And this enables me to have a closed system--I can weigh, blend, and load the press directly from a container. When we look at some of the existing systems--I guess one of the issues that I was concerned about when I first came to Johnson was that in J&J we have a lot of fluid bed processes, and in my travels throughout the industry in the New Jersey, Philadelphia, New York area, I never really saw much of fluid bed processes, I think probably because of the competition but also because of the recognition that when I look at a fluid bed process, I'm looking at a very complex process. And from a compliance end, this, again, presents a lot of opportunities for me to have nonconformances.

Going back, historically, in the days of training FDA investigators, one of the things you point out is that when you see a piece of equipment with a lot of dials on it, you ask the company: what do all these dials do, what kind of controls do they have around them? And now with increased computerization, we start getting more printouts of alarms, alerts, things like that, and so I want to cut this number down--and this relatively complex equipment is going to increase my process time.

I recognize there may be some processes that this is needed for but, again, when I look at development today, my first choice is direct compression, simple processes. And, again, that ties in with PAT, with the analytical aspect of it, from a dissolution and a content uniformity perspective.

In discussing cleanable closed systems, we're looking at wash-in-place tablet presses, also, where, again, I have the minimal operator exposure. I have the good cycle time, I can turn it over relatively efficiently so I can move forward in that area.

As I point out, PAT, along with any other analytical--new analytical technology is going to identify flaws in my process if the process is not properly developed. So I think the--one of the messages that I've taken away with PAT is I have to have a well-defined, a well developed process that's consistent, otherwise PAT is going to show me the flaws in my process.

And this is another comment on the direct compression piece of it. Again, fewer variables, fewer steps, fewer opportunities for nonconformances--and that's where I'm looking at it from a compliance aspect.

One of the concerns in cycle time in manufacturing is the documentation review. I can move forward with PAT and improve my cycle time and my processing times--not stop the process--but what stops the process is really the documentation piece: the review of records, the nonconformances, the problems that occur in the manufacturing process. So, if I move forward with that, I think I can tie in with PAT and have a process that's properly developed, that's going to be consistent, and that, from an operational perspective, I'm going to be satisfied with.

In--just to wrap this up, in conclusion I've talked about the development from an operational and compliance end and hopefully this ties in with what you're addressing in your areas of PAT. Thank you.

DR. LAYLOFF: Thank you. Any questions for Hank, comments? One comment, Hank: I've hung around this business for probably about as long as you have, and I think the bar for solid-dosage forms hasn't changed much. Content uniformity's pretty much the same over the period of time; dissolution, once we put it in place, has been pretty constant over time. But what has changed, I think, is excipients and APIs. I'm reminded that I was looking into sucrose one time because I was fascinated with a proposed change in a monograph, and I contacted one of the guys over at the Food Chemicals Codex to find that they had changed the lead limit on sucrose. And I said, was that because of some imminent health hazard associated with using sucrose in tablets that didn't occur in soda pop? And he said, no, it was technically feasible.

And it seems like, in the case of raising the bar, the bar has been raised on excipients and APIs to what is technologically feasible, but since our statistical sampling is absurd, we've let the other bars stay about the same. Anyhow, thank you very much.

DR. MORRIS: Tom, could I ask a question of Hank?

DR. LAYLOFF: Sure.

DR. MORRIS: Hank, I wanted to clarify: you made the comment that you must have a well-defined, controlled process or PAT will show flaws--is that right?

MR. AVALLONE: Yes, I think when you look at large numbers of tablets, you're going to see issues if the process isn't well developed and uniform and consistent, and we see that. One thing I didn't mention, with the analytical technology, to kind of give Tom a plug: if Tom were in the St. Louis laboratory right now, you'd probably have your products tested using this technology right now. Is that a fair statement, Tom? Right. So, I think that in looking at this technology, as we move forward, we're going to find problems with processes, and I've seen it with some of my products now--I'll give you an example.

I looked at some of my annual reports for a product over a year's period of time. And I looked at my content uniformity for this product. And it ranges from about 96 to 104--real good, tight content uniformity. But every now and then, I get a 62, right. I'm looking at this thing. And the question is, right now, you know, Joe gives me one or two a year that I can retest. I get one or two of these. And when I look at this, I have to take a step back and say, is this a real number or is this just analytical error? And it's a difficult call to make right now. But, again, I look at it over a year's period of time and I'll look at 40 batches. And I'll have one batch, or two batches, that come in with a 62 out of it.

All right, I think if you looked at PAT for this product, for individual batches--if you looked at 10,000 tablets, or even more--you're going to find, possibly, one or two of these tablets in that batch. And the testing that I do now is destructive, so it becomes a more difficult question: is it analytical or not? But with this technology, you're going to have that tablet, you're not going to destroy it. And you're going to know whether it is a real number or not, and you're going to have to deal with it. And I think the issue is going to be, if you don't have the good, consistent manufacturing process, you're going to have problems in this area.

You're going to find out things that you didn't want to know you had.

MR. COOLEY: Could I throw out, maybe as a challenge to the group, that using PAT may be a way of getting to better process control and better process understanding, rather than meaning that we have to have better-defined processes to make sure PAT never shows up a flaw?

MR. AVALLONE: Well, I think they work together--maybe I didn't get that point across--but I think you really need the two. I mean, I can't just say I'm going to go to PAT if I don't have a process that's well developed and consistent and uniform.

MR. COOLEY: But couldn't PAT get you to that? If you use PAT in the development stage, don't you feel you could get to that stage much faster by having more data?

MR. AVALLONE: As Jeff pointed out, I think where you're getting to is--I don't know--as the process moves along, it's moving along pretty quickly, and I think PAT is probably going to come in once you get into the operational stage rather than in the development stage.

I gave you the example that I have a product now that's going into probably phase three, and my API supplier--the research guys in API--gave me four forms of it. And we're looking at three or four excipients to manufacture a directly compressible product. So if I put everything together and I get a product that's consistent, well defined, and well developed, when I put this in my operating plant, then I'll be able to utilize PAT; it's going to be a good candidate, hopefully, for PAT technology. Not at the development stage, though--I think it's too early at that stage, because I'm still working with the product and I'm going into trials with this particular product, so I need to define it now and maybe at a later point in time kick in the PAT.

DR. MORRIS: Tom, could I raise a point as well? Two things. One, Hank, is that you may find flaws in your process--that's of course the case--but I think part of the charge of the committee, and part of the reason that we're all here, is that if we find flaws in a process, that's one thing; we don't want to find flaws in the process that are really there because the sensors haven't been properly applied. And that's sort of more of this--I hate to say safe harbor--but that's sort of more of the concept of saying that, during the period when you are applying them, you don't artifactually develop data that makes it look like there are flaws that are really just a function of the fact that the implementation isn't done. Just as a point of clarification.

The other point is that, with respect to development batches--I don't know how many batches of things we've run over the years, but the idea that we've embraced as much as possible is actually another one that comes from Father Tom here, which is that PCCPs--process-critical control parameters--are what are important. The value may change as you scale, the value may change as you change process conditions, but if you truly have identified a critical point that needs to be monitored, the fact that that's the variable that needs to be monitored doesn't change. The absolute value may change. So, thereby, I would say there's significant advantage to doing it during process development, during clinical manufacturing, all along the way--again, once we're through--we'll have to think of a better term--once we're through the point of making sure that we properly applied the technology.

MR. WALTERS: I just had a comment. I feel that if you apply PAT to any properly developed process today, you will find some variability which may not necessarily mean flaws in your process or your product.

MR. AVALLONE: I don't know that I would agree with that--again, I gave you the example. By today's standard, if I have a well-defined process that's very consistent and gives me good reproducibility, then it probably could be a candidate for PAT. I think the issue that's going to come up, the one that I'm struggling with, is the release rate, the dissolution piece of it.

I think the technology is probably moving there and I'm not, maybe, the expert on this, but I think from an activity perspective--a dose uniformity perspective--we're getting there. But I think the other piece, to demonstrate the uniformity of the excipient in the product, is maybe not there. I think that's the tougher issue that PAT is going to have to deal with: the release rate and dissolution. And that's why the ideal candidate I'm looking at would be a directly compressible product--few excipients, little variability, uniformity of the excipient--so I can control that aspect of it with technology.

DR. LAYLOFF: One of the things we discussed previously was that most of our process stream has been monitored through the API all along, and the excipients have sort of been ignored. They've just sort of been hung along with it, which, of course, gives you a problem if you start looking at dissolution properties, because you're not monitoring all of the materials in the blend, you're just looking at a single component of it, which gives you a warped view of things.

The other thing I'd say is on an outlier--you have a nice population if you find one out there. When you have analysts in the laboratory doing routine analysis, and you're doing it destructively, it's very frustrating. I mean, I had an analyst go back and run a bottle of 1,000 tablets to try and find another 50 percent tablet. And it was probably analyst error that cost two or three weeks of your tax money.

MR. HALE: Okay, there's been a lot of talk about using PAT to look at existing products with existing processes. I think where our real opportunity is, is to use the testing capabilities to design processes that are inherently scalable, inherently measurable, and inherently controllable, which does not always exist with current technology. So I think if we limit ourselves to thinking in terms of how we measure things within the existing scope, we will be limiting this whole process. Where the big gains are, I believe, is not measuring more things but allowing the measurement to enable better design, and it has to go back into development to get the optimum advantage from this process. If we limit ourselves to batch processes, if we limit ourselves to specifications that were built around old existing products and processes, we will not allow ourselves to create the advantage that we could here. It has to be in development--the development of products and processes that meet the new criteria--and not constraining ourselves to the way we've done it in the past. The static blending systems, the granulation, and all of these things don't necessarily apply anymore, and we need to be able to do things that we haven't done in the past.

DR. LAYLOFF: Art.

DR. KIBBE: I think Tom hit on a good point. I was going to try to get there, but I'll skip that one and go to my next point. During the evolution of pharmaceutical manufacturing, we have continually improved our ability to analyze what we do. And this is one more step, and it's not any more frightening than any other step we've taken.

Remember when we couldn't measure penicillin down to the amounts that we can now? We've added a whole bunch of processes to make sure there's no penicillin contamination. Well, we would never have done that if we couldn't measure penicillin to those levels.

And the invention of pharmacokinetics came about because we actually started to be able to measure the drug in the blood supply, so we could actually make some measurements. So this is an evolutionary process, not a revolutionary process. And if we take that in mind and we say to ourselves, what are the standards that we need to have to assure safety and efficacy in the patient--if we can monitor 100 times better than that, that doesn't mean we need to change our standards. And I think companies don't need to be afraid that we're going to look for a 5 percent variation with a tool that now measures one-millionth of the variation we could measure before.

I think Tom's right. This is an opportunity for the industry to come up with ways of improving the process so they can save time, save money, reduce batch failures and out-of-specifications, and know when the process is starting to go, long before it gets outside the specs it needed to get approved for use in humans, so they can make those changes.

Tom also said something about the odd numbers from analytical. Well, if you have a nondestructive method--and the nondestructive method can be validated against a destructive method--you can go back and look at it again.

I mean, I look forward to the day when every tablet that comes out of the line has been scanned and we get a uniformity indicator, maybe a fingerprint, as we talked about before, that gives you a sense that there is, indeed, the right mixture of all the excipients and the active in it, and you can see during the run that this moves slightly. But it moves within a constrained environment, because the run is not absolutely perfect. And we accept that because we know that the variation in it is not clinically significant in the end, relative to the clinical variation.

So, I think if we can assist the FDA in writing guidelines that make that clear, and the industry looks at it as an opportunity to save money, to have a better controlled process, to be more sure of the product that they make, I think it's going to work out well. And I think we can do that.

Now, one of the things that we're doing today is focusing on solid-dosage forms. It might be useful for us, I think, in the long run to focus on that as we develop the guidance and then allow it to expand to things other than the oral solid-dosage form, which seems to me a priori to be a little bit easier to handle in most cases.

DR. SHEK: Tom, I want to just re-emphasize the thing Tom was talking about and what, Hank, you had in your first slide, okay--talking about building quality into the product.

I think we have to look at it this way: PAT is another analytical tool, and PAT won't make the product better by itself; it may make the testing more efficient--a better test. But the opportunity--I absolutely agree, and I was a little bit disheartened to hear from you, Ajaz, that evaluations within the industry are that people are a little bit reluctant to look at that. I think there's a great opportunity there, basically, to build the quality into the product by understanding the process.

And I would like to push it further: it can also be an existing product where a company decides they'd like to improve the process. I think here there is an opportunity to use PAT, maybe in collaboration with the regulatory agency, to facilitate this change, because here we'll have data we can really use. And I think that's where the gain really is--we can, this way, improve the quality of the product that we have today and make it more efficient and effective.

DR. SHAH: Just a comment on what Hank said. In my experience, less than 10 percent of solid oral dosage forms are manufactured by direct compression. You know, most products are manufactured by the conventional wet granulation process. The dose may be too high, the solubility may be too low, whatever the reason, but the majority of products end up going through a wet granulation process.

DR. BOEHLERT: I just wanted to make one other comment. We've been talking about using PAT and learning things about your process you wish you didn't know. But, in fact, some of those concerns are happening today as manufacturers go back and look at old products with new technologies.

My experience relates most often to the laboratory kinds of issues and if you look at an old product with a new method, you may, indeed, find things that you didn't know. It may, indeed, not meet the requirements that you have on file, but in the end, what's going to be important is whether there's any impact on safety or efficacy of that product. The product itself may not have changed. It may have always been the way it is now. What has changed is the way that you look at it and we need to keep things in perspective, you know, have safety and efficacy been impacted, or do you just know more now about the product that's always been out there?

DR. HUSSAIN: Tom, sort of two comments. One is, I think, with respect to a lot of the regulatory risks that you want to deal with--we have posed for you a set of questions; if you could sort of go through those questions, I think this discussion really fits in very well with that. And that's the reason I asked you to move the training discussion to the afternoon.

But the point I want to make is that I think we all believe that, you know, we have to build the best product and so forth. And my concern, just listening to the discussion here, is that the pressures on R&D seem to be quite significant to just, you know, move forward. And my concern is that, in many ways, if proper care is not taken, you actually risk losing your clinical database itself. Because the products that you use for clinical testing really have to be good quality, too.

And so the trend has been to delay formulation development as late as possible because of the high failure rate in the clinic. And that's the reason I said the manufacturing problems that we see--yes, many old products do experience them, but more and more the newer products are having manufacturing difficulties, too.

So I think the reluctance--and Efraim pointed this out--in a sense, what I have heard from many people on the R&D side is: we don't have the time to deal with this, so don't bother with it. So how will we turn that around? I think it is through time and through education and so forth, because a lot of the formulation development activities, and the people who do them, may not be aware of these technologies and how they can help them develop a better product. So, with time, that will come around.

DR. LAYLOFF: A couple more comments. I think with respect to time, I think at some point we will see a formulation driven in part by the technologies that you're using to assess it. That you might have surrogates to assess product quality. So you actually look into the product design by the technologies you're going to assess it with.

But I don't think that PAT is going to bring to the industry, the revolution that came when we went from spectrophotometric methods to chromatographic methods. I mean, you talk about opening Pandora's Box, that did it big time. I don't think the current path on process control will impact what we do as much as chromatographic procedures did.

I think with that--oh--

DR. MILLER: So, Tom, just to concur and substantiate Ajaz's last point: I gave a presentation to the Philadelphia Pharmaceutical Forum on May 9, to about 75 people, about PAT and all of our activities, past and present. Probably there were 60 formulators and developers from more than a dozen companies, and this was absolutely new to them. PAT is new to formulation and development personnel in general. They have caught a couple of buzzwords through what they have read or heard, but, in general, in May of 2002, in the Philadelphia area, to a dozen firms, my sense was that this was new and that they have not had the opportunity in the past to use sensor technology in the formulation area.

I'm not speaking of chem development people, where APIs are routinely monitored by sensor technology, but for formulation developers it was very clear that this is new terminology, new thinking, and they will have to be trained up to deal with it.

DR. KOCH: Tom, if I could make a comment relative to things going on in other industries: the last 10 years have seen an evolution from what was an analytical profile, in terms of acceptance of raw materials and final products, to often a performance-based forum for deciding whether to accept products, et cetera.

So in many industries, things have been changing. The use of PAT in those industries has changed the way analytical work is being done; often much more predictive and inferential analysis is showing up. So I think the type of things that we're seeing here, in terms of a trend, are consistent with what's been happening in other places and will only be of benefit, long term, to this industry.

DR. HUSSAIN: Tom, sort of a request, in a sense if you could consider sort of structuring the next part of the discussion on the questions that were posed and go through that for the rest of the--

DR. LAYLOFF: After the break.

DR. HUSSAIN: Okay.

DR. LAYLOFF: Before we take a--we're going to take a break very shortly, but before I do, I wanted to point out to you that the Process Analytical Technology initiative has been posted on the dockets for your comments. So if you go to docket--go to www.fda.gov/dockets to number 02D-0257. That was recently posted up, again, it's www.fda.gov/dockets and it's number 02D-0257.

And with that, it's in the back of the handout on all your handouts at the table. And with that we'll take a break for 15 minutes.

[Morning Break.]

DR. LAYLOFF: Okay, attached in your handout is a series of questions, which have been posed to us by the FDA. And I'd like to have us address those at this time. Ajaz, would you like to go over the--read the questions?

DR. HUSSAIN: I hoped you would do that.

DR. LAYLOFF: Okay, I'll read the questions.

Question one--that's a good beginning--Question 1: How would the committee articulate its shared vision of pharmaceutical manufacturing and CQC/A using PAT? That hasn't been met with enthusiasm.

MR. COOLEY: Tom, one question I have on that: maybe Ajaz could expand on what you were looking for there? It wasn't real clear to me what you were asking for--is it a mechanism, you know, going out on a road show, or exactly what did you mean by that?

DR. HUSSAIN: Well, maybe I should go back. One of the aspects which we think is important is to clearly define what we mean by PAT from a regulatory perspective as we start developing a guidance and so forth. Essentially, what I've said is that ensuring a proper definition of PAT is important for the purpose of developing regulatory policies and procedures. The definition would need to be sufficiently broad to help the public and industry realize the benefits of the shared vision of PAT, yet be specific enough to draw a distinction between the PAT concept of continuous quality control or assurance and the current approach that emphasizes lab-based testing to document quality. In a sense, what I was hoping to get is some dialogue from the committee on how the committee articulates its vision for pharmaceutical manufacturing and the continuous quality assurance paradigm under PAT.

In a sense, are we on the same page in terms of PAT being a tool to understand your processes to a degree that essentially says end-product testing is either unnecessary or minimal, or what, and that sort of thing? Because once we have that discussion, then we could actually discuss the difference between traditional parametric release and PAT-based continuous quality assurance, and whether we should draw that distinction or not. I want some discussion on that topic.

DR. MORRIS: Can I, just--one point that might be worth considering is that there are really two ways of applying this, in general, and everything in between. One is that you follow a process and monitor its progress, and if it starts to vary, then you know it has varied and you have, maybe, an assignable cause or something to look back on.

The other is that you use the feedback from whatever technologies you're using to control the process, which is quite a different set of circumstances. So I don't know if that needs to be encompassed in the overall articulation. But, certainly, something as a subcommittee we need to address. And, certainly, in terms of what it would mean in terms of shifting mentalities for the regulatory side.

MR. HALE: I think, to expand on that, there is not only control of the process, but there are also the ways of communicating between industry and FDA in the specifications and how you are going to release product, and that seems to be the fear here. But as these processes develop, there are multiple ways to release product, whether it's by testing physical product properties of a set sample or releasing based on immediate measurement of the particular dosage forms; those specifications will be different. So that needs to be added, I think, to this, too.

MR. CHISHOLM: Yeah, I think I could try and give you, very briefly, a summary of what the AstraZeneca vision, for want of a better word, is. And I think, for new products, which is where we're focusing from now on, it starts, actually, in formulation design, and it needs to go that far back. That doesn't mean that you can't apply this to existing products; you can, quite successfully. But if companies are going to look far ahead, then their own executive directors have to get this accepted both by the pharmaceutical side of things as well as the operational side of things. And that's where you get the true benefit.

So, firstly, it's about formulation design. Secondly, it's then, having got that design, it's about technology transfer, because it helps you with that technology transfer. And I think that's why the last time you used the word--let's include the word continuous, as well as batch processes to enable the technology transfer in a much easier way.

It then comes down, next, after that, in your manufacturing process, to real-time, statistically-based quality monitoring. What you're actually doing is statistical process control. And if you think of the thing we're thinking about, which is a tableting process, that's right up to and including blending, before you go into the tablet press.

Once you get into the tablet press, there's not a lot you can do if you haven't got the previous batch right. What you have to do then, for our friends in regulatory, of course, is the old-time quality assurance. So you statistically monitor tablets--statistically based, like all other process industries--across a batch.

So, I think that's the vision we have: it starts, basically, in original design and goes all the way through to real-time quality assurance. I think that's what the FDA is really thinking about with the term they're now using. The term parametric release, I think, is totally unsuitable for this, because it's about process and product, not just about process, and I think the thought about parametric release in the past has always been more about process. So that's where I would be coming from on it.
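A minimal sketch of the real-time, statistically-based monitoring described above, assuming a hypothetical stream of in-line readings (for example, a predicted blend potency as a percent of target); the limits here are simple three-sigma limits from the early in-control readings, a simplified stand-in for a properly designed control chart:

```python
# Minimal Shewhart-style monitoring sketch; data and limits are hypothetical.
import statistics

readings = [99.8, 100.3, 99.6, 100.1, 100.4, 99.9, 100.2, 99.7, 100.0, 101.9]

baseline = readings[:9]                            # assumed in-control history
center = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
ucl, lcl = center + 3 * sigma, center - 3 * sigma  # conventional 3-sigma limits

for i, x in enumerate(readings, start=1):
    status = "ok" if lcl <= x <= ucl else "OUT OF CONTROL"
    print(f"reading {i:2d}: {x:6.1f}  [{status}]  limits {lcl:.2f}-{ucl:.2f}")
```

The last reading trips the upper limit, which is the point at which this kind of system would trigger investigation or corrective action in real time rather than waiting for end-of-batch laboratory results.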

DR. RUDD: Yeah, it's interesting. I think we're starting to get a bit of clarity on, let's say, the differences, or the difference in priority, which seems to exist at the various stages of development and manufacture where PATs can be used.

I think, in terms of any definition and terminology, what we have to get clear--we've said all along--is that the quality-by-design concept is what we're interested in. I think it is, therefore, crucial that we think about PATs first and foremost as a development aid: the process understanding, the process signature, the process characterization come from the use of PATs in development. That will be limited, for reasons we've heard already--aggressive time scales, lack of materials, lack of variation in materials; all of these are constraints in development. But you can get so far.

You can begin to build a picture; build a model, start to get some understanding of the process. At that point, you then have the transferability of whatever technology you've developed at that stage and you continue to refine the model. You need to use larger-scale information, greater batch numbers, manufacturing-based information to refine the model that you've developed in the development phase.

And the PATs will be used there, but differently, subtly differently from how you've used them in development.

And you then get onto what you might call the routine use of PATs, where the process understanding bit is almost gone--you know, it's too late to worry about that. And I think what then ensues, as Bob and one or two others have said, is that you're into the use of PATs as an enhanced form of statistical process control. If the process is in control, if it isn't varying at all, that's fine. But if there is subtle variation, or if there's gross variation, the PATs can help you bring it all back in again--the feedback approach that Ken has talked about.

So what you've got there is, although the enabling tools are maybe the same all the way through, you've got different prioritization, different drivers, depending on which part of the business you're in. And the definition and the terminology will need to reflect that. We're not talking about a single label PAT that applies to all of those situations.

DR. HUSSAIN: That helps, in a sense, because David had presented his vision at the first meeting, and so did Bob. And Pfizer had its presentation of their vision of PAT, and they were all very similar, right from the first time and so forth. So I think what David just outlined is the hierarchical aspects of PAT in its different uses. That's what the definition and the use of the term should truly reflect.

DR. LAYLOFF: Okay, we are going to question number 2: Define CQC/A. Should CQC/A be distinguished from parametric release?

DR. HUSSAIN: The whole concept--I was struggling with the term, because I think the CAMP folks have used CQV and they put a trademark on it, so I said I can't use that term, so--[laughs]

DR. MORRIS: I'm sure they'll license it to you.

DR. HUSSAIN: So, I didn't want to use that, but the whole concept simply means that you're controlling your processes, with feedback and whatnot, so that at the end of the production cycle you're essentially done; you don't have to wait for the lab to pass it back. So that's essentially what we're trying to define there.

DR. LAYLOFF: Anything else on that? I think we've already separated it, I think.

DR. SHABUSHNIG: Just one point of clarification from your talk earlier, Ajaz. With the sort of concentric rings of overlapping systems--and I agree with that model you're showing there--what you're saying, if I interpret it correctly, is that it may be appropriate, if you are missing some data in one of those areas, that you still have sufficient information to release the lot--to judge lot quality based on information that you have from other systems. Is that correct?

DR. HUSSAIN: Right, I mean, I think a measurement or a sensor would be part of the system not the whole thing, definitely. And so, the built-in redundancy and so forth, would essentially define that the system is adequate to do a continuous verification of your quality so that lab-based testing in some cases may not be necessary for release.

DR. SHABUSHNIG: But it's not just redundant sensors, it's really that you have other kinds of information that is still sufficient to--

DR. HUSSAIN: Correct.

DR. SHABUSHNIG: --assess the quality of the overall batch?

DR. HUSSAIN: Correct.

DR. SHABUSHNIG: Okay.

DR. HUSSAIN: Sort of to elaborate on that: let's suppose you are looking at blending as a unit operation. You, in your development, have identified a blend time and have developed an SOP. Now, the SOP requires an operator to load the powder materials in a certain order. And so, if you have an online sensor to assess blend homogeneity, it actually verifies that the SOP was carried out correctly and so forth. So its use could be verification of that step, and probably building or collecting information for the next step--maybe linking it to dissolution. So, that's how we would view that.

DR. SHABUSHNIG: Thank you.
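As a concrete illustration of the blending example just described, one approach often cited for in-line blend monitoring is to watch the variability of successive sensor readings and declare an end-point once it settles below an acceptance limit. The sketch below is hypothetical throughout: simulated readings, an arbitrary block size, and an arbitrary threshold.

```python
# Minimal sketch of blend end-point detection from an in-line signal:
# the blend is judged homogeneous once the standard deviation of a moving
# block of consecutive readings falls below a (hypothetical) threshold.
import statistics

# Simulated readings (e.g., predicted API concentration, % of target) that
# settle toward the target as blending proceeds.
readings = [92.0, 107.5, 96.0, 103.0, 98.5, 101.0, 99.4, 100.5, 99.8,
            100.1, 100.0, 99.9, 100.1, 100.0]

BLOCK = 5          # consecutive readings per moving block
THRESHOLD = 0.5    # hypothetical acceptance limit on the block std deviation

for end in range(BLOCK, len(readings) + 1):
    sd = statistics.stdev(readings[end - BLOCK:end])
    print(f"readings {end - BLOCK + 1}-{end}: block std dev = {sd:.2f}")
    if sd < THRESHOLD:
        print(f"End-point reached after reading {end}; blend judged homogeneous.")
        break
else:
    print("End-point not reached within the monitored window.")
```

Used this way, the sensor verifies that the blending step actually achieved homogeneity, rather than relying on a fixed blend time alone.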

DR. HUSSAIN: I think the distinction between this concept and parametric release--I think Bob already alluded to his thoughts on that, and the reason that I put this on is that we are moving towards some discussion where we may be on a different page with Europe--the European agency--with respect to parametric release. So this would help us, in a sense, formulate our thoughts on: is parametric release very different from CQV, or whatever that concept is?

DR. LAYLOFF: I think--and, certainly, in the United States, it would be very different, because in parametric release, the product itself that's being released has never had a measurement made on it. So it's really a leap of faith, based on your measurements on a surrogate, that allows you to go forward, and this is not anywhere near that.

DR. RUDD: Yeah, if I could just comment on the European situation. It's fairly timely, because Ajaz referred earlier to the CPMP/EMEA guidance on parametric release, which appeared, I think, toward the end of last year.

There has been a small working party commissioned by CPMP, charged with the task of providing more extended guidance. So it's industry providing some input now to CPMP, maybe to close the gap a little bit. And a number of us, potentially from AstraZeneca, Pfizer, and GlaxoSmithKline in the U.K., have recently developed some guidance which has actually been presented to CPMP today. I think it was inappropriate to circulate that draft document to this group before CPMP saw it, but I'm more than happy to try and do that immediately afterwards.

It is narrowing the gap. It does reflect very much the quality-by-design concept. The parametric release term, I think, has been a bit of an albatross for a number of years, because it is historical and does mean a number of different things to different people. The proposal is that it be replaced with the term real-time release, and the document very much develops the quality-by-design concepts. And I think it does close the gap, as I said, with the position I think this committee's at.

But it does also provide an extra piece of information which I think could be very useful to consider here and that is some proposals which clarify the relationship that could exist between a process-based measurement and the end product quality attribute, that might be predicted by that process measurement. So, to give an example, I mean, despite what PQRI might tell us, I believe intrinsically that there's a relationship between powder-blend uniformity and tablet-content uniformity. It just seems intuitively right to me. So that's a nice one.

Similarly, you can make a relationship between powder-blend assay and finished-product assay. And the document attempts to derive other relationships. So, you know, what combination of measurements could you make which might be predictive of dissolution testing, for example? I think that's a very useful point, and I think any guidance that we eventually develop would be well advised to try and address that same point--maybe not in the same way, but not to leave that point untouched. I think the gap's closing; that's the important thing.
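To illustrate the kind of relationship being proposed, here is a minimal sketch of fitting finished-tablet assay against powder-blend assay for a handful of hypothetical batches and then predicting from a new blend result; the numbers, and the assumption that a simple straight-line fit is adequate, are illustrative only.

```python
# Minimal sketch: relate an in-process measurement to an end-product attribute
# with ordinary least squares. All data are hypothetical.
import statistics

blend_assay  = [98.5, 99.2, 100.1, 100.8, 101.5, 99.7]   # % of label claim
tablet_assay = [98.2, 99.0, 100.0, 100.6, 101.2, 99.5]   # % of label claim

mx, my = statistics.mean(blend_assay), statistics.mean(tablet_assay)
slope = (sum((x - mx) * (y - my) for x, y in zip(blend_assay, tablet_assay))
         / sum((x - mx) ** 2 for x in blend_assay))
intercept = my - slope * mx

new_blend = 100.4
print(f"tablet assay = {intercept:.2f} + {slope:.3f} * blend assay")
print(f"predicted tablet assay at blend {new_blend}%: "
      f"{intercept + slope * new_blend:.2f}%")
```

In practice such a model would need far more batches, a check of its prediction error, and agreement on how it feeds into release decisions, which is exactly the guidance question being raised here.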

DR. HUSSAIN: The historical sort of baggage with the term parametric release--I'm very pleased to hear that at least they're moving away from that, because parametric release, I think, Tom, as in your presentation at the recent meeting we were at together, in essence creates a scenario where the confidence is not there. So even when you have parametric release for parenterals, people just do the test anyway, for fear of lethal concentrations and so forth.

So moving towards more science-based measurements, I think, alleviates some of those concerns associated with parametric release.

DR. LAYLOFF: I think, in our earlier discussions, too, there's no intent to abandon all testing; stability testing would still be there, and things like that. It's a different ball game. Question 2c, I don't think we need to address. Going on to Question 3: Does the Subcommittee wish to refine or modify the working definition of PAT proposed at its first meeting in February? If so, how should this be modified?

DR. HUSSAIN: The definition that came out of the benefits working group was, essentially, systems for analysis and control of manufacturing processes based on timely measurement, during processing, of critical quality parameters and performance attributes of raw and in-process materials and processes to ensure acceptable end-product quality at the completion of the process. That was the proposed definition by the group and I think, keeping in mind some of the thoughts David summarized about the different aspects of PAT in different arenas of development and so forth--would we want to stay with some similar definition or sort of modify this?

DR. KOCH: Yeah, I think this fits very well. I think the emphasized word there in the definition is going to beg for some dynamic, timely definitions of critical. And I don't think it needs to be in the primary definition, but I think there'll be a subset of what is critical at this time, based on technology or performance.

DR. MARK: I think there's a word coming in here which we first heard from our European friend and I'm hearing it several times. And the keyword here seems to be time. You can imagine a whole range of possible technologies in use. Some will give an answer in a second, some in a minute, some in an hour, some in four hours or whatever. And the question then becomes, well, what do we mean when we say timely? What do we mean when we say real-time? I think this is a question which sooner or later is going to have to be addressed.

DR. SHABUSHNIG: But, just to comment back--can't that be left in terms of the context of the process that's being measured? In other words, if you have a process that's a two-day process, an hour measurement periodicity may be appropriate, whereas, if you have a process that takes a minute, you need something tighter. And I'm not certain that we want to constrain ourselves in the definition at this point. I think there has to be appropriate science around what the appropriate timeliness or measurement interval should be.

DR. MARK: That may well be--that it will have to be, as Ajaz said, every new technology will have to have its own SOPs, but I think sooner or later that is going to have to be something that's going to have to be part of the definition.

DR. MORRIS: Just to Mel's point, if I can, for a second--oh, sorry, did I step--very briefly, to the word critical here. It may not be necessary, only in the sense that you may be monitoring parameters that, independently, don't constitute a critical component, but when taken in conjunction with others, give a signature which, as David I think mentioned last time, would be the real metric.

DR. RUDD: Yeah, if I could just come back to that point about the real time concept. And I think John's comments are exactly right. It is, obviously, process-dependent.

As an example, I think all we're really talking about with this idea of real time is the ability--and this is very much in the manufacturing environment--the ability to make a measurement and then do something about it, in terms of corrective action, if that's what the process needs. So it's a time frame whereby we're not just making a measurement, it's a measurement that we can react to. One example I got is from the food industry in the U.K., and this is particularly important for continuous processing: a lot of their analysis is very much off-line, still laboratory-based, but with extremely rapid turnaround of measurements, so that they can actually go back and correct the process or take a time slice out of the production material if the process was out of control, particularly. So, it's just really that. It's about making a measurement that you can then do something with, that you can react to--feedback corrective action--rather than just making a measurement and writing it down and never doing anything with it.

MR. COOLEY: One comment, David, though, is, when you say it's a measurement that you react to, you know, are we limiting the application of PAT by saying that you have to react to it or you have to control something with it? Because, as you mentioned earlier, going back into the process in development is where there may be great benefit of PAT that the subcommittee's not really addressing.

And in that aspect, it may be just monitoring what's going on and doing no control whatsoever. So I would challenge that maybe we need to take the word control out of the definition and make it a timely measurement that lets you understand your process.

DR. RUDD: Yeah, I mean, I did preface it by saying, in the manufacturing environment. So, yeah, I'm simply making the assumption there that there is a need to make the measurement during manufacturing. And don't forget, we might well do enough in development to establish that we don't actually need to measure anything on a routine basis. But making the assumption that if we are making a critical measurement during manufacturing, then, presumably, you want to do something about it if the data from that measurement is not what you'd expect--hence the reactive component.

But, yeah, you're right, I prefaced it by saying in the manufacturing arena. And that's the only scenario where I think you would need to use the measurement in a reactive sense.

DR. MILLER: And that goes to what we have discussed in the past, in the previous meetings. I would like to add, for clarity of thought, I believe two sentences are better than one long sentence. It aids in thinking and appreciating the concepts. And let me suggest a little refinement to the beginning of the first sentence and the beginning of the second sentence. I would like to see something in this order: Systems for analyzing and controlling manufacturing, and delete of. The first sentence ends with the word processes. The second sentence begins with PATs--capital-P, capital-A, capital-T, small-s--assure acceptable end-product quality at the completion of a pharmaceutical manufacturing process. These two sentences, to me, aid in clarity and allow for, let's say, bigger thinking. It separates and allows for thinking. My small suggestion, thank you.

DR. HUSSAIN: One sort of aspect which David raised was that in development we may find, using all the technologies, that some things need not be measured. So the definition in the development arena and the manufacturing arena could be slightly different. But, essentially, the technology which I would also sort of ask you to consider is design--statistical design of experiments; would that be part of PAT? And this was a discussion we had at the first meeting, because now, instead of doing a trial-and-error type single-factor experiment, we develop a product and we have very little information about interactions and so forth.

But now, if a company opts to do a well-designed experiment--some companies do that now--would that be considered as PAT? Because one of the suggestions, which I didn't put as a question, was to change the name to Process Assessment Technology rather than Process Analytical Technology. My personal feeling is analytical is assessment, so that goes to that point, in the sense: would something have to be measured to be PAT? That is the question.

DR. CIURCZAK: I had a thought about what Rick was saying. We're moving to a conclusion here in terms of controlling and looking at every--eventually looking at every tablet, as some people would like. And we forgot some of the early work that Ajaz brought in. Some of the people spoke of this taking a year or so, sometimes, to make a process, because there are large gaps. And should PAT encourage just substituting things, like sticking in a probe to measure moisture rather than sending it out for Karl Fischer, et cetera, et cetera--in other words, shortening the process as it now stands? Give some feeling of confidence to the process engineers that these probes give us good information and work, so that they'll eventually buy into the tablet-by-tablet down the line.

I think, in terms of--if Henry Ford had waited for the Mustang, we would have been riding horses from the early 1900s to 1965. If we want to encourage instrument manufacturers to progress to the point where we have the speed, accuracy, and precision to read tablets as they come off the press in milliseconds, then in the meantime we should allow them to make a living selling their instruments for such things as putting a probe into a granulator or a blender or things like this. You can't go from a grandfather clock to a quartz watch in one week.

And I'm thinking that, if we focus totally on total control, are we now taking away from the very large and very real economic benefits of putting on-line process instrumentation into play where we now take samples up to the lab, and cutting something from six months to six weeks? And wouldn't that, indeed, give everybody involved, including management, the confidence to say, hey, they were right there, they're probably right about this PAT thing now, and let's control everything?

DR. BOEHLERT: I also would suggest that, perhaps, we can clarify this definition by dividing it up into two sentences. Right now, the way I read it, the focus is on the process with the dosage form and controlling and monitoring that. And, in fact, if you haven't controlled and understood the properties of the excipients and active ingredient that you put into that process, there's no amount of controlling and monitoring of the dosage form process that's going to give you a final product that meets all requirements. And somewhere we need to get that thought in there.

It now talks about performance attributes of raw and in-process materials, but that's not something you do during the processing of the dosage form; that comes before. Hopefully--not always, but hopefully--you don't want to start and find out you've got a problem midway through the process.

So you might want--if you divide that into two sentences, you might be able to get that thought incorporated.

DR. MORRIS: Yeah, and just to follow up a bit on your point. I think, certainly, the intent is not to exclude monitoring of individual unit operations, or certainly not API or excipients. Is that served by broadening this to be not just inclusive of manufacturing processes, but to break it down more in the language to ingredients, unit operations and processes? I mean, it could be that simple.

DR. SHEK: I think, it's there. It talks about raw materials, right? The way it's written now, it says, that--

DR. MORRIS: Right, attributes of raw--

DR. SHEK: --attributes of raw materials. So I thought that's what, basically--

DR. MORRIS: Yeah, I was just saying to change the language to be a little more specific, to say, you know, pharmaceutical unit operations and actives and excipients, but that might address--well, I mean, just so it doesn't exclude that.

DR. ANDERSON: I'd like to ask a question. Are we talking about a method of providing information on the quality of whatever it is we are manufacturing at various stages of the game, so to speak? And what we do with that information depends on what the information says. And if, in fact, my understanding of this is correct, then it's not clear to me why control is a part of the definition. So, my question is, are we looking at a method of determining the quality of whatever it is we're doing at a given stage of the game, as opposed to actually having a system that controls what happens when we find something? You understand my question?

DR. KIBBE: All right, I have the burden of authorship, I guess. This definition was the result of a lot of discussions about what we think process assessment technology or process analytical technology can do for the American public, for the agency, and for the industry. And we think it can do a lot of things beneficial for all of us.

First, it can, in some places, replace older methods of releasing batches more efficiently, more actively and in a better way. In some places, if applied correctly, it will help the company even control their own process so that they don't have to worry about the loss of a batch because the process starts to go bad partway through; they can monitor it on an ongoing basis, which we put in as timely, and make adjustments.

It improves the process because it will allow quicker release and, therefore, the timeliness of the information and the release of the batches and the time it takes to do a batch or an individual product gets shortened, and the cost to the company gets better. And it makes it, in some ways, easier for the agency, because the agency can then depend on a whole set of ongoing information whenever it reviews what's going on, and it doesn't have to look at snapshots.

And we've recognized--and I hope most of you understand--that some of the snapshots that we use now to release batches are not very statistically powerful. We take very small numbers of tablets to decide that we're going to let a million tablets walk out the door. And we are going to feel, I think, much better about all those decisions when we put things like this in place.
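[Illustrative sketch: a minimal calculation, assuming a simple binomial model and hypothetical defect rates (none of these numbers come from the discussion), of why a small release sample has limited statistical power--the chance that the sample catches even one defective tablet stays low unless the true defect rate is large.]

    # Hypothetical illustration only: probability that a small release sample
    # contains at least one defective tablet, under independent (binomial) sampling.
    def prob_detect_at_least_one(defect_rate: float, sample_size: int) -> float:
        return 1.0 - (1.0 - defect_rate) ** sample_size

    for n in (10, 30):
        for p in (0.001, 0.01, 0.05):
            print(f"sample n={n:>2}, defect rate={p:.1%}: "
                  f"P(sample catches a defect) = {prob_detect_at_least_one(p, n):.1%}")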

And hence the agency--and I'm going to speak for the agency, even though I'm not in it--is very encouraging of getting industry to do this, because it then increases their level of confidence that good decisions are being made on a day-to-day basis that affect the health and well-being of the American public.

So, yes, control is important and it's part of it, and we're not making purely a regulatory definition; we're making a definition everybody can work with and use in-house or on a regulatory basis and so on. And so I think that's important.

Timely is important, because information that's untimely is what we do now. So, I mean, we're trying to get better at this process. And so, I think some of those terms are good, now.

I agree with what my colleague over here says: if you have a sentence that goes for more than four lines on a typewritten page, it probably is going to be confused. And the people who will confuse it the best are the lawyers. And I apologize to all of you out there who might be a lawyer. But they will, you know, just--and I think we might want to strengthen the definition on the regulatory side of the aisle by breaking it up into bullet points or something where we know clearly, exactly what we want.

And I also know that wordsmithing using 28 people to do it is a nonproductive process, okay? And while all of your suggestions are great, I think it's probably a good idea to let one or two people sit down and try to come up with the next stage of it. So, I hope I've helped.

DR. ANDERSON: Let me just clarify. I am not against the word control. What I'm--if you're talking about an NIR system, the NIR system doesn't control anything. It provides information. I think what I'm questioning is the placement of control in this particular sentence. It's not the system that does the controlling; something else happens as a result of the information that's provided by the system.

DR. HUSSAIN: To clarify, I think when we talk about PAT, we say PAT because the measurement, the information technology, the feedback control--the entire thing is a system in our mind. The measurement part is just one part of the system.

DR. MILLER: It goes back to the English. The system is analyzing and controlling, which is different--that's what I want--I want gerunds in there. It's too passive and it's subject to confusion and other interpretations. So I agree, I agree a couple of people need to wordsmith it and get it into a couple of sentences or bullets, and that's how we'll work our way out of it.

It's--the more you think about the way it's written, it allows for too many interpretations.

DR. LAYLOFF: Okay, I'm going to invoke the Kibbe rule and we're going to stop discussion and Ron, you can talk to Ajaz.

MR. CHISHOLM: Yeah, can--I mean, just as an example. I was fortunate enough that we had our senior management together with us and Ajaz last week, and I put the definition in front of them, thinking it would all be wordsmithed and changed, just as you all are doing at the moment. And lo and behold, not one single word was changed. All they said was, let's hope that the definition David was referring to--the definition that comes out of Europe and that lot--and the definition that we have here are harmonized in some way, because I think that's probably quite important from an industry viewpoint.

I'd just like to guard against one point. We, as an industry, do not intend to test every tablet under any circumstances, because there's statistically no need. It would be going from the sublime at the moment to the ridiculous. We will test a significant sample, which I think is where we need to be.

The answer to your question, I think, Ajaz actually answered, and that is that you have to think of these systems in their entirety. We would, in fact, gather data in batches--that's on the raw materials, blend times changing, the tablet analysis, et cetera--over a large number of batches. Firstly, we'd do it during a batch to make sure we weren't going out of specification. But our data would then be, as it were, data-mined and analyzed to look for long-term trends so we could understand the processes better.

In that way, it's about control. But you're quite right, it's not about instant control, because if you ain't got it right, you ain't gonna get it right.

DR. LAYLOFF: As always. Okay, if we can move on now. Have we identified--this is Question 4, on page 3: Have we identified the key issues--real or perceived--that can be categorized under the heading of regulatory risk or uncertainty, and do you agree with the current thinking on how these risks may be minimized?

DR. HUSSAIN: I think the way we sort of approached this is, for marketed products with a good compliance history, essentially with no known history of problems, we believe the quality is good and it's fit for its intended use. I think I want to keep emphasizing that this is focused on improving the process and we are not questioning the quality of the product. So with that in mind, we sort of proposed how we would address that.

And one of the main risks that has been identified by industry, as it happened today, also, is the risk of finding flaws in the current system. And our position is that the current system is fit for its intended use. There's no safety or efficacy concern. So there should be a way to resolve that and then move forward and not be penalized for that.

And the point I--Dr. Woodcock made at the Science Board presentation was that some of our current testing could create that. For example, with content uniformity, it's a situation where no tablet should be outside 75 to 125 and, on stage one, essentially, it says that when you test 10 tablets, if the mean is between 85 and 115 and the RSD is 6.8 percent or less, that's acceptable.

If you assume a normally distributed system, what it means is that at 6.8 percent RSD we will actually have tablets outside 75 to 125. And when you increase the sample size, then you will find those outside 75 to 125, and it means that every batch is out of specification, literally.
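[Illustrative sketch: a worked version of this arithmetic, assuming a hypothetical batch with a mean of 100 percent of label claim and a normal distribution at exactly 6.8 percent RSD; the figures are for illustration only, not a regulatory calculation.]

    # Hypothetical illustration only: with a normal distribution, mean 100% of
    # label claim and RSD 6.8%, a small fraction of tablets falls outside 75-125%
    # even though a 10-tablet sample can easily pass the stage-one criteria.
    from statistics import NormalDist

    mean_pct, rsd_pct = 100.0, 6.8
    sigma = mean_pct * rsd_pct / 100.0          # standard deviation, % of label claim
    dist = NormalDist(mean_pct, sigma)

    frac_outside = dist.cdf(75.0) + (1.0 - dist.cdf(125.0))
    print(f"Fraction of tablets outside 75-125%: {frac_outside:.2e}")
    print(f"Expected tablets outside limits per million: {frac_outside * 1e6:.0f}")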

So what we are proposing--and the Science Board endorsed that--is, when we find something like this, we will use a rational statistical approach for addressing that and not say this is out of control--so, my glass is out of control.

DR. LAYLOFF: Ajaz has become out of control.

DR. MORRIS: If I could just comment, I think that goes back to something we talked about at the first meeting, which is reconciling the specifications from two different methods. I mean, the errors that are associated with a PAT, as opposed to a compendial test, may be different, but if they map to each other statistically--and I'll leave this to the mathematicians and statisticians--then you're not out of specification as long as they're mapped. But I think that's the--I don't know what the word would be--biggest request on behalf of the industry to the agency, that that be recognized. That, in fact, when you do have tails of the distribution that we don't now see, that not impugn the product. I think that's what it comes down to.

DR. HUSSAIN: Just to summarize for the committee. The current thinking is the safe harbor concept that Dr. Woodcock has talked about. Essentially, the way we have framed that safe harbor concept is that we believe that the current system provides product of good quality that is fit for its intended use. During development of PAT applications on marketed products, the information collected using experimental PATs would be considered as research data. Only approved regulatory tests will be used for product release and regulatory decisions. So you should feel free to sort of collect that data, and then we can find a way--if there are flaws, then how we would address that, but not be penalized for that.

DR. MORRIS: Is it audited, though, still? Is that data audited, or is that an open question?

DR. HUSSAIN: Not for agency purposes, it's research data, so you would use that for making or transitioning into the PAT application.

DR. LAYLOFF: I think the legal reference methods are going to be the approved method or the USP method, I mean that is the benchmark, that's what you operate from and if you have other data, it's not really relevant from a regulatory point of view, it's academic.

MR. FAMULARE: No--as part of a regulatory inspection, that wouldn't--you know, if your R&D facility normally isn't inspected, then in terms of somebody doing a post-approval GMP inspection, if you're doing R&D work on PAT, that wouldn't be the normal course that an inspection would take you through. Once you implement PAT or PAT becomes part of the paradigm, you know, then we have to look at it--

DR. LAYLOFF: Up close.

MR. FAMULARE: --from a reasonable perspective. And as Ajaz alluded to, if you're using a specification for content uniformity based on limited sampling, then we have to have the proper guidance for our investigators and the teams doing these inspections, that you have to see it through a different set of glasses. It's a different statistical paradigm. And the company, basically--the bottom line is the company is taking this on for product improvement, and we shouldn't do anything in our inspectional approach to hinder product improvement; otherwise, we've defeated the whole purpose of our inspectional program.

DR. SHEK: So, in practical terms, okay, if we're going into 4a, okay, where it says robust products, and a sponsor decides to look into using PAT and they find some data there, you know, information. Will this data now be open to inspection through, let's say, a general GMP inspection, and the question would come, have you done something about it? Here you have the data, and I would assume some concern might be there. And that's not R&D; now it's already in production, manufacturing. Maybe it goes to a technical services group to look at whether there are some findings there, and it still passes, you know, the specs--everything is there with the test--but we have some findings which may be directing you to do something with the product.

DR. MILLER: And the follow-up to that, if I may, is that that was the discussion about safe harbor all along. It was about finding unintended results; it has nothing to do with doing routine testing in a PAT environment for whatever attribute you want to define. It was safe harbor for unintended findings. And we're skirting or skating, excuse me, away from that point a little bit. And I want that to come into focus.

DR. HUSSAIN: No, actually, we are asking the question to you, I mean, the question is being posed to you. What is the committee's thought on the safe harbor concept in this instance? What we think is, in the sense, and I'll have Joe sort of answer, also, is to say that now you have moved PAT to a manufacturing line--

DR. MILLER: Right.

DR. HUSSAIN: --it's still not your primary method, you're still collecting data to see whether it's suitable and you're actually going through the validation process. Now, you routinely see a few more tablets which are outside, quote/unquote, "specifications."

The decision, I think, what we will have--as we go through the validation process of the PAT, you already have a validated process with the old method. That method will be used, so we're not using--so at some point we would need to meet and say, all right, with PAT you are seeing these defects; what are your new acceptance criteria based on sound statistical principles, so that your process is the same used before and after? So you really have to come up with a new set of criteria for how to evaluate those numbers.

MR. FAMULARE: The existing regulatory paradigm, even going back to our previous subcommittee meeting, will remain sound, so we're not going to use that new data, now that it is online in the manufacturing area, to impugn the product, as long as your existing validation and regulatory methods are working and doing what they are intended to do.

And as Ajaz said, what the next step would be--well, now you see this trend, it's not something for our investigators to report on the 483 or to initiate some regulatory action. It's something we may come back to you on and say, okay, where are you going now with the PAT and what will we do with this process?

DR. MILLER: I'd like to comment and, just for the record, with discussions that we've had externally at CAMP and also with Bristol-Myers Squibb, I would like this to go down as part of a definition for safe harbor. Application of PAT to a particular process or product will be at the sole discretion of the manufacturer--and don't write it down, I'll give it to you again, but just think about the words. The application of PAT does not necessarily imply that a critical parameter has been identified. The FDA agrees that a company cannot be inspected, held under unusual scrutiny, or be held liable for regulatory requirements as a result of data generated during the PAT development and implementation phases. And if we need to write it on the board, we'll do that. But that's the beginning of where we are with safe harbor. And it goes, again, to this aspect of finding unintended circumstances.

DR. HUSSAIN: What I would suggest is in this instance, if you could just share that definition with all the committee members and that the committee could make our recommendations on that.

DR. MORRIS: I think one point--and it's in Ron's definition, as well--is that, obviously, if any company sees dramatic excursions, they're not going to wait to be told to look at them. But during the phase when there is still a question of whether or not the implementation, as we talked about this morning, is proper, then you can get spurious results that, in fact, don't reflect the process. And, as we were talking about with Hank, the best way to find polymorphs is to scale up, and the best way to find flaws in your sensors is to scale up, as well. And I think that's the spirit of the definition.

DR. MILLER: The follow-up, in the spirit of this, is--these are approved processes, there is no question--whoops, likewise, Ajaz--there are no questions about the product or the process. But, you know, technologies are technologies and Acts of God happen, so we need to understand that.

DR. LAYLOFF: I wanted to make a comment on that--you know, we have a discussion and that's very useful. However, I think it's important to know that many of these comments should be submitted to the docket as public comment, you know, so that they're out on the docket.

DR. MILLER: Well, we appreciate that, but this also stimulates immediate thinking and challenges our committee members and anyone during the two days. I'll be glad to put that down in writing, very clearly, but it goes to the process of stimulating your thinking.

DR. LAYLOFF: No, the stimulation is fine, but send it in to the docket.

DR. HUSSAIN: Tom, I'm not sure--I mean, I'm not sure; the whole thing is sort of a public record anyway. For the docket, we had sort of a different thought; the docket was actually to get a different type of information. So this is sort of a suggestion from a committee member, to sort of have the discussion here, and that's relevant to that.

MR. FAMULARE: Just to follow up on your thought, while PAT is developmental, you know, you have all those concerns, but you have the concern that Hank raised in his robust process where he gets this outlier at 62 percent and what does he do with it? Well, maybe PAT will help him, you know. So then, you don't want it one way, but we'll give it to you the other way if it helps you.

DR. KIBBE: Let me just say that when we discussed this, we discussed the regulatory environment being empowering. And I think your points are well taken, and I think we would encourage the agency's guidance to take them into account and empower the companies to try PAT out, to use it on a process, and if, for some reason, it doesn't help them control that process well and meet the current standards, then we're not going to make them do it. All right? But I have a sense that some of these unforeseen boulders are going to be bumps in a process to a better environment all around, and that, in the process of developing a PAT, if they find one of these things and they want to continue to go forward, they might find ways around it, they might find cures for it, or they might find a way of correlating the data they get from their PAT to the data that they already get and say, all right, the standard on our standard testing is x and the standard for our PAT testing is y and the two are directly correlated, we still produce the same product--is that okay with the agency? And then the agency can go forward.

And so, while it's nice to worry about things that might happen, we haven't opened the closet to monsters incorporated on this. I think we can go forward and I think we need to be clear with that. We are not going to force a company that takes the energy to look at PAT and try to develop something to implement it just because they've tried it. Okay?

DR. BOEHLERT: I just wanted to make one other comment. It's not unheard of now for the FDA to come into a company who thinks it has a product and process well under control and make comments on the acceptability of that process and its controls. We're not going to eliminate risk here, you know; that risk is always there that somebody's going to take a look at what you're doing and say it's not what we think you should be doing or how you should be doing it.

The concern is that, once you start working on PAT and you have data on hand that confirms that observation, the agency will not look at it as a safe harbor kind of concept, but look at it as, well, we could have told you, if we'd come in earlier, that you have a problem with your process. And you know that people are not going to want to generate more data that will just, you know, be on hand to show that they do, indeed, have a problem, because you may not think what they have now is okay.

MR. FAMULARE: But aren't--if there were to be an enforcement or any type of an issue it would have to be based on the conventional, existing paradigm, not what PAT did or added to it.

DR. BOEHLERT: Yeah, exactly, but having additional data on hand may not help that situation, as far as the company is concerned.

DR. MORRIS: I think one point is to--

MR. FAMULARE: I'm sorry, just to finish that thought. Then the company already knew it from the conventional data and this is just icing on the cake, so--

DR. MORRIS: And I think that was sort of the point I was going to follow up on, to your 62 percent point: what we've seen is that processes that are fairly robust--at least in our hands, even at scale at some point; certainly we haven't done as much full-scale work as the folks across the aisle--typically are benefitted by the application of PAT. They reflect that.

And the processes that are on the edge, everybody already knows they're on the edge, I mean, that's not a secret, so I think, to Judy, that's to your point, is that it's certainly not going to make a process that's on the edge look any less variable but, hopefully, it points out opportunities for improvement.

DR. HUSSAIN: To sort of re-emphasize: we truly want this to be a win/win, and the lack of trust and the lack of history has been an issue. We have to rebuild that trust, and as you go down the questions, you can see how we're trying to do that.

One of the aspects is, in the sense, for PAT-based submissions, as we identified--that's the reason I was focusing on the definition--we really need to distinguish PAT applications and inspections from the rest of them, because we are creating a new team which should be the only folks who are reviewing and inspecting these things, and not anybody else. So you have, essentially, a new regulatory paradigm emerging from this. So, as you go down the questions you'll start seeing how we sort of intend to handle this.

So what the safe harbor concept simply is--it's a good compliance history, it's an approved product, safe for intended use, there are no safety and efficacy concerns, most of the time and I don't expect, personally, to find any safety or efficacy concern. There will be concern of deviation from maybe some established specifications and that, I think we probably want to address through statistics, a more statistical approach.

And then, I think if variability can be reduced with the application of PAT, I think it would encourage companies to do that. And companies would, obviously do that. So that would be the sort of paradigm. So.

DR. MILLER: And that comes from the discussions that were held at the Science Board--

DR. HUSSAIN: Right.

DR. MILLER: --Janet and you were involved speaking to the statistical tails that occur so, you know, that's out there and we have to use PAT to potentially control that to a finer level.

DR. HUSSAIN: Sort of a personal point I'd like to make here is this--in a sense, I think, the zero-tolerance type of limits that we have worked under with USP and so forth. Keep in mind, USP is not a release test. USP's a market standard. It was never intended to be a release test. So what it simply means is, if somebody takes the product from the market and tests it according to the USP, you have to meet that standard. It's that standard, and so people sort of blur those things together.

At the same time, I think with the content uniformity test as it's outlined in the USP right now, we know if it's normally distributed you will have numbers outside that. And today, how do we deal with that situation? We actually throw away batches because they're out of specification, and in some cases the quality of the batch that is rejected and the quality of the one that is accepted is no different. So, are we just feeling good about having a zero tolerance and saying we don't want to deal with it? This is a way to really deal with the science issues underlying the whole process.

DR. MILLER: Ajaz, that also goes to the harmonization point, because there is some concern about the fact that, well, these products are tested per USP. So how does Europe or other countries--how will they accept, potentially, a product that doesn't have a USP test, where it's an alternate test?

DR. HUSSAIN: You always have the USP test, you have that USP test.

DR. LAYLOFF: And the USP does not require that you test by the monograph. It says that if tested by the monograph, it has to comply. But you can use alternate technologies as--

DR. MILLER: Well, then it goes--I appreciate that, but it goes to labeling and nuances, I think--

DR. LAYLOFF: No, it just says, if tested, it would comply to the USP standards.

DR. HUSSAIN: There's no difference.

DR. LAYLOFF: I think we've hit most of the 4s, haven't we?

DR. HUSSAIN: The 4a, the question 4a was essentially saying that the statistical criteria, essentially the normal distribution and the inherent variability that we currently accept is one of the reasons for finding flaws.

Are there any other problem scenarios that would need to be considered for products which are in good compliance? I mean, that's the question. So, now we can go on to the next one, then. So, everything is right on target.

DR. LAYLOFF: We didn't get much enthusiasm on that one.

DR. HUSSAIN: Well, I think question 4b could be looked at from two different perspectives. One is that, if we are able to say that it's a good compliance history, we don't need question 4b; that's one way of looking at that. Or we should consider 4b--I mean, that could be the way of addressing that. Because you may find something which should be corrected, and then you really need to have a risk basis, not sort of use the penalty format; you say correct it over a period of time or something of that sort. A risk-based approach would be needed.

DR. MORRIS: One point on 4b, I think, is that, you know, there may be times when you try to apply PAT, say, to blending or something and there's just no correlation at all. In which case, you say, well, this is not the sensor or I haven't, you know, implemented it properly. That seems to be fairly straightforward. But it comes back to, then, if the industrial scientists make that call, then it comes back to the training of the reviewers and inspectors to recognize that, I think, as well. So, it's training on both sides, but to me that's an easier hurdle to overcome.

MR. FAMULARE: I think a lot of 4b would also be the use of enforcement discretion when these issues are found and what steps the company is taking towards resolving them if they are legitimate issues that need to be addressed.

DR. MORRIS: That's corrective action.

MR. FAMULARE: It's the step towards the--

DR. MORRIS: That's the other side, yes.

MR. FAMULARE: --which is part of the normal paradigm, you know, steps towards compliance is the most important consideration that we look at.

DR. RUDD: I'm sorry, I think I'm slightly behind. I think my comment relates to 4a, but it will be very quick. Just really to reinforce the fact that we need to recognize that we'll see statistically more variability as a result of the application of PATs, and so, I think in terms of any training component, we just need to get a good understanding of what that additional variability might be. I don't have any answers to that, but I think it's just a recognition that, you know, the expectations need to change. Sorry for being a bit behind, there.

DR. LAYLOFF: Okay, I'm going to stop this. Did you have--want to comment on this?

MR. CHISHOLM: Just the main thing that came out when I put these questions to people was what we'd like from the agency is more of a definition of what--when I put this to a number of people, really the questions that came back are that they would like the risks to be more specifically defined, I think, rather than generalities. What does constitute a problem? You know, I think there's a variable feeling in the industry that it's still a little bit woolly, although everybody's getting a very warm feeling about all the correct things that are being said. You maybe have to be slightly more specific. And I'm thinking not so much of existing products here as even of new products. It just goes back to Dave's point there that there will be statistical variations, which is something the pharmaceutical industry's never dealt with in its life. So, it's not a yes or a no situation anymore, it's a maybe situation.

And we have to give some thought to that because it is a very risk averse industry. So, it's just a comment rather than a question, I think.

DR. SEVICK-MURACA: Yeah, I would like to see this issue of statistical variability with new PATs somehow being formally recognized in our guidance--that, as new PATs come through, there has to be a cogent scientific approach to handling the measurement variability. And that's the thing that I'm really concerned about, because sampling sizes are an issue here; depending upon low-dose, high-dose, you know, there are going to be enormous ranges of variability, and these need to be addressed in how we're going to regulate and how we're going to put PATs into the validation concept.

So I think that we need to do some training on that.

DR. MARK: You know, maybe I'm showing my ignorance here, but I'm not sure what it means to say you have a risk-based approach. Is that a standard pharmaceutical term, or what's the meaning of it in this context?

DR. HUSSAIN: Well, I think, everything that is focused on safety and efficacy, most of the time we don't think there's a safety and efficacy issue. But if there is a concern with respect to safety and efficacy, for example, we see a number of tablets at 60 percent and so forth, that the dose truly is lower for a drug, then a corrective action would need to be sort of worked with the agency and so forth, so there's a risk associated with safety or efficacy.

DR. LAYLOFF: Okay, we're going to stop this discussion at this time. We've invited a speaker from NIST to be with us this morning. James Whetstone is going to tell us a little bit about what NIST does.

MR. WHETSTONE: Thank you, Tom. Let me get this thing going. There we are. Well, thanks again for the invitation. My name is James Whetstone, I'm the Chief of the Process Measurements Division, which is one of the divisions at NIST that's in the Chemical Science and Technology Laboratory; I'll speak a little bit more about that and, again, thanks to the committee for allowing me to take a few minutes of your time to tell you a little bit about what NIST does and how that might have some impact on process analytical technologies as they might be applied in the pharmaceutical industry.

First of all, these are some discussions of what NIST does, what we think we do, how we do it, what our core values are, our mission and vision statements, I'm not going to repeat those. I think you all can read that about as well as I can.

We're a part of the Department of Commerce. Our mission is strongly oriented toward providing measurement technologies and standards for industry and government agencies. And we strive to realize our mission and vision and use our core values in order to do that.

NIST is a broad--has broad technological capabilities that run through a variety of industrial applications or interests all the way from electrical power where, you know, everyone has one of these things sitting on the side of their house and they're all traceable to the primary standards that are maintained, actually, by the electricity division just across the 270 here. One of the tall buildings you saw over there was our administration building.

All the way from electrical power to medical testing, dentistry, transportation of various sorts and refrigerants here means that some of the work that NIST has done in the past, about 10 years ago, accelerated the acceptance of the Montreal Protocols for new refrigerants that are not global warming materials.

This is an organizational chart of the organization of NIST. It's composed of a number of things, NIST, actually was derived from the National Bureau of Standards about 15 years ago. And there were some new duties that were given to NIST at that time. And those are embodied, really in three places.

One is the National Quality Program, Ajaz mentioned in his presentation of the Baldrige Award. And the Baldrige Award is administered by the National Quality Program. The Advanced Technology Program is a funding vehicle for high-risk industrial research activities. The Manufacturing Extension Partnership is akin to the Agricultural Extension Agent system that has existed in the U.S. through the Department of Agriculture for almost, I think, over a century, actually.

This puts technological expertise throughout the states, available primarily to small manufacturers.

These seven laboratories are what we call the old Bureau of Standards. Those are the technical capability of NIST, this comprises about, oh, 80 percent of our total staff.

What I'm going to talk a little bit more about is the Chemical Science and Technology Laboratory, where the technical expertise is lodged that is pertinent to the discussions of this committee.

CSTL's vision and mission are similar to NIST's. Its specialization has to do with chemical, biomolecular and chemical engineering activities. What we try to do is enhance U.S. industry's competitiveness and capabilities through the application of new measurement technology and standards. Part of this has to do with the assurance of equity in trade and, obviously, it impacts public health, safety, and environmental quality, also.

Our activities are really enunciated by these three goals. We have a measurement standards activity, which is a core mission responsibility of NIST. We provide--we, CSTL, provides standards in these areas, as I mentioned above. We have a quite extensive reference data activity that is centered on chemical reference data of various types and biochemical reference data.

And then measurement science is the area that is the wellspring of our technical capability. We engage in a wide variety of research activities that are aimed at ultimately improving the ability to make measurements.

This is just an organizational chart of the Chemical Science and Technology Laboratory, Bill Koch was supposed to be here today to give this presentation, but he's out of the country so I'm giving it. My division has somewhat more application to process analytical technology than some of the others, although all of them have some contribution to make to there.

And just to emphasize, the way we're organized is really by discipline. So, if you look at this, Analytical Chemistry is just what it says it is. Physical and Chemical Properties is just that: physical and chemical properties of both materials and chemical processes, primarily, and some physical processes. Surface and Microanalysis Science is primarily world-class microscopy capability, all the way from optical to various types of charged-particle-based microscopies.

We have responsibility for the kinds of things listed here, these are the group names, we have responsibility for the national measurement standards for these types of what I call thermodynamic variables, which are, in many cases, intimately attached to the manufacturing processes, certainly that the pharmaceutical industry's concerned with.

And the Biotechnology Institute--or Biotechnology Division--looks at biotechnological processes; structural biology is an important piece of that. And in that we have a collaboration with the University of Maryland and its Center for Advanced Research in Biotechnology.

We speak of our programs in the terms of the industries that we try to serve with advanced measurements technologies and standards. Certainly health care is a pertinent issue today.

Our facilities, as I said, are mostly just across the interstate. You're certainly welcome to come. It's a little bit harder to get in the gate these days than it was about a year ago. But it's still not difficult. You might see this building as you go back down the interstate to the airport; that's our administration building. NIST has a facility in Boulder, as I mentioned; there's the CARB facility, which is just about five minutes away from here; and we have some facilities in Charleston, South Carolina, in collaboration with the National Oceanic and Atmospheric Administration.

What do we do? Well, we provide standards for a lot of different things. And what I'm going to do is run down this a little bit. I picked some of these things because I felt that they would have application, perhaps, to this particular audience. Raman spectroscopy has certainly become a process analytical technology that's widely used. Mel Koch here, from CPAC, and Kelsey Cook from MCEC have organizations that are practitioners of that art, and they're practitioners of those arts in industrial contexts, and there's a fair amount of experience in having done that. Not in the pharmaceutical industry, but certainly in many others.

Spectrophotometry, optical absorbance standards--I think there was some mention earlier today about the penetration of that particular technology into the pharmaceutical industry, and what we do is provide the absorbance standards for those devices.

Reference data, well, there's a lot of it, and I just put some stuff down here that was, I thought might be useful in this industry. Certainly the mass-spectrometric database is sort of a hit-and-miss product that's in just about every analytical mass-spectrometer that hits the street. It's sold through our office of standard reference data to most of the mass spec makers and they incorporate it into their software. It's updated about every two or three years.

And then, as I said earlier, we provide instrument calibration services for these kinds of things in my division.

Just a quick thing--one of the things the industry came to us about, three or four years ago, was the fact that Raman spectrometry is getting to the point that it is, as I said, a widely used process tool. We think there will be issues about the ability to look at the intensity response from one instrument to the next. The ASTM committee on spectroscopy felt that was the case, too. And so what we've developed is the first fluorescent standard that can be used to calibrate in situ Raman spectrometers. It allows you to do a number of things; one, it allows you to compare one process to another without moving the instrument from here to there. The 785 nanometer excitation source is one of the most commonly used sources in industry at this point, so we decided to do that one first.

It will be available for sale either--certainly by September, perhaps, by now. I signed the report on analysis of this thing about three or four weeks ago. We intend to go to the other commonly used excitation sources and, as I said, we expect the impacts of this to assist the industry in doing comparative measurements in process control.

So, with that, just going to put this back up again. We try to work with industry as much as we possibly can, with other government agencies. Typically in a third-party, disinterested-party role and, certainly, welcome any kind of comments you might have.

I just thought that I might add a plug. And that's this plug. Mel is certainly a member of the IFPAC Board, as is Rick Cooley, and Kelsey's involved, and there are some other friendly faces here. I think Ajaz put together a session at the IFPAC meeting last year. This is the International Forum on Process Analytical Chemistry. It's a place where you can go and listen to folks who have had a lot of experience in applying various types of spectroscopies, primarily--and it's beginning to include some sensors, as those technologies mature--to process analysis and control issues.

So, with that I'll stop and thank you, again, for your attention.

DR. HUSSAIN: One, just to say, NIST, I think, would be a very, very valuable partner to FDA with respect to the standards and things that will evolve. What we have been trying to do is link with NIST and, in fact, at the next meeting on PAT, we might offer an opportunity to spend a day and have a workshop at NIST on some more technical aspects. So that's something that we are considering right now. In addition to that, I think standards for information technology, also, is something NIST can help us with in that regard, and we are exploring that possibility. So, thank you again.

MR. WHETSTONE: Thank you very much, Ajaz.

DR. LAYLOFF: I think, Ajaz suggested that we meet with NIST because they might be able to generate standards necessary to provide calibration for various sensors.

And with that we will break for lunch. And we'll start again at 1 o'clock. Thank you.

[Lunch Break.]

A F T E R N O O N S E S S I O N

[1:12 P.M.]

DR. LAYLOFF: We have four people who have requested to make a presentation to the committee at the open public hearing. Dr. Justin Neway, from Aegis Analytical Corporation.

DR. NEWAY: And what's the magic secret to getting the slides to show?

[Technical Interruption.]

DR. NEWAY: Mr. Chairman, ladies and gentlemen, thank you very, very much for this opportunity to speak to you today.

My name is Justin Neway, and I am one of the two founders of a company called Aegis Analytical Corporation. We're a software company based in the Colorado area, near Denver, and we develop and supply software systems for pharmaceutical manufacturers, specifically.

What I'd like to speak to you about today in the 20 minutes or so that I've requested is to present you with a perspective that I haven't heard discussed yet, except, actually this morning, some elements of what I'm about to talk about came up. And I'd like to use those openings to illustrate a particular set of problems that I think need to be taken into account with respect to what PAT does, in terms of bringing out guidance and implementation in the industry.

I've called my talk Implementing New Process Analytic Technologies: The underlying challenges.

And I'm speaking specifically about in the manufacturing area itself, rather than in process development or R&D. And there is a specific set of problems that I'm going to address today.

To get started, I'll give you a little bit of background on myself and the company so that you know the basis on which I'm making these statements. And then just to recap some of the benefits of PAT as we see them and I think as many manufacturing professionals see them.

I'll outline these challenges and rather than just leaving you with a bunch of whining, I'll actually attempt to tell you what I think can be done, both from an industry point of view and a vendor point of view and a regulatory point of view, to help these things converge and achieve the kinds of objectives that I know you have as a PAT subcommittee.

So, to start with is that I'm a trained biochemist and microbiologist. I spent 15 years in pharmaceutical manufacturing in several different companies. And I became very intimate with the data environment associated with process development and manufacturing in pharmaceuticals and biotech.

After that, I started Aegis about five years ago with venture funding from the venture arms of GlaxoSmithKline, Merck and Aventis. By this time, I and my colleagues had made presentations and visited, essentially, the top 30 biotech and pharmaceutical companies over the last 5 years. Several different sites, several different organizations within each and we have a tradition of actually, convening customer advisory panels to develop requirements for our software that help us more closely address what the industry's needs are. So this is the backdrop for the statements I'm going to make.

You can see that I've been on both sides of the table, both as a user wanting to solve the kinds of problems that pharmaceutical manufacturers have and now actually being in the position of being a vendor supplying software to address those issues.

Now to quickly summarize what I see as the benefits of PAT implementation, they're pretty obvious--we've gone over them this morning. I would like to emphasize the two in the middle here, most particularly. Shorter cycle times and batch release times and moves towards parametric release. I was very pleased to hear the interesting new distinction coming up on parametric release being real time release. I think that's something that is extremely important to distinguish: the fact that there is a time element involved.

Okay, so we want to improve all of these things, we want to achieve those via PAT implementation, but what about today's failure rates, compliance, and yield problems themselves?

The challenge in quality compliance, I think, was outlined best in what I found to be Janet Woodcock's words earlier this year. U.S. drug products are of high quality, but. And we know these buts: increasing trends towards manufacturing problems; recalls; disruption of operations; drug shortages; negative impacts on NDAs; low efficiency manufacturing QA; slow innovation and modernization.

Why do these problems occur? Having been in the manufacturing business myself, in the trenches, as it were, I know these people are not undermotivated or somehow not trying to achieve these things. There must be obstacles and reasons why this is so.

And I think the obstacles can be summarized in this slide here and the two that follow. I'm defining here what I call data-intensive decision making. And that is where you need to make a decision for which you, first, need to gather data from various systems in your manufacturing operation that allow you to make that decision rationally.

Those decisions come up in two broad areas. One is in quality and regulatory compliance, GMP, in general.

And the other's in process control and stability. It happens that these are closely related, but it's often the case that manufacturing professionals don't necessarily see them as being closely related. When I talk about quality and regulatory compliance, I'm talking about parameter review for batch release; I'm talking about defensible specifications; investigation of atypical batches; manufacturing process validation; production trend analysis; annual product reviews. You'll recognize right away that these are not something you simply sit down at your desk and begin to expound on. You need to gather data first and do investigational work, descriptive analysis and investigation analysis to be able to make good decisions about them.

On the control side we've spoken and heard much about that this morning: shorten process start-up times and scale-up times; shorten troubleshooting times; and reversing adverse events and trends; improving process stability; product quality and productivity. In general, improving return on net assets. These are things that all manufacturers want to do.

But there is a set of constraints within pharmaceutical manufacturing that makes that particularly difficult. What are those constraints?

Well, one of them is, in fact, what I term the real manufacturing data environment. To summarize it, we could say that the necessary data are located in many separate places. Okay, they are all over the place. And for good reason. Systems have grown up over the decades to supply specific needs and specific parts of manufacturing and, as a result of supplying those needs, they've accumulated data about those needs.

Here I show a LIMS system; SCADA, PLC, and DCS-type systems; batch record systems. SAP would be the archetype of the ERP system. Many people have data warehouses that house subsets of this data.

But this data universe, you know, this environment serves an excellent purpose. It allows people to do their job of manufacturing pharmaceuticals and releasing batches. But it also presents a significant difficulty, because each time you want to do some kind of investigational analysis or data-intensive decision making, you often have to go to several of these data sources to get the data. And that, today, takes weeks--not days or minutes or hours, but weeks--in some cases, months.

And that's the reality of the data environment that I've seen first hand and worked in first hand.

Now, when we speak of batch release and shortening batch release times, you've heard G.K. Raju speak about how much time it does take to release a batch. Simply having a new probe does not speed that up. The data for a PAT instrument or several PAT instruments would typically accumulate in just one of these systems.

Batch release consists of looking at conformance of parameters for raw materials, unit operations, and final product and the data for all of that resides across all these systems. PAT instruments are just one or two of the components required for batch release.

So I can speak, then, about data--decision making inefficiencies. And I'm being generous here when I say inefficiencies, okay? There are problems, challenges.

It takes several weeks of manual data retrieval to be able to do the kinds of data-intensive decision making we've spoken about. And you can consider batch release, as I've mentioned, to be one of those decision making tasks, whether it be PAT-involving or otherwise.

What happens is what I've called spreadsheet madness. In general, vendors have been supplying customers with Excel add-ins, as a way of doing analysis, okay--or environments in which they're free to write any command line program they like.

Well, as I mentioned, I'm a biochemist and a microbiologist, I don't happen to like writing programs. I hire other people to write programs for me. Process engineers sometimes like to write command lines.

But we find that quality professionals, process engineers, plant managers, supervisors, operators, in general are not interested in or willing to write command lines. They want point-and-click systems. And why shouldn't they have them? They're abundant in other areas where we work today.

There's a bewildering choice of inadequate software. By that I mean that most analysis tools, most decision making systems, have in fact grown up to serve a general set of needs, and they're being force-fit into the manufacturing environment. What people need is easy access--meaning a point-and-click environment--to those analytical techniques that are most appropriate for the pharmaceutical environment, for the kinds of data-intensive decision making I've described.

And, finally, the ways of communicating results, I find, even today are antiquated. In general, tables of numbers--vast numbers, lots of numbers, where people have to do mental additions and subtractions--are what people are communicating to one another, when we have so much computing power that three-dimensional imagery is easily accessible.

In fact, that leads me to a description of some industry trends. There is plenty of new instrumentation coming along, and part of what we're talking about here has to do with that: cheap data storage; computing power; increased enforcement of GMP; patent expirations. Industry consolidation and globalization are forcing companies to try to identify centers of excellence in manufacturing, reduce redundancies, and focus on specific manufacturing plants or regions of the world where they can produce the kinds of quantity and manufacturing efficiency that they want.

The technology already exists to adequately deal with the inefficiencies I've described.

And I want to give you just two examples of the kind of technology I'm talking about in a couple of areas. One is a feature extraction capability.

If you imagine for a moment that you have some probe, let's say it's a new PAT probe, it's measuring some signal, as you see on the right, that's triphasic. For proper release of the batch--and this is theoretical, okay?--one has to define the rate of increase of the middle phase right here. Doing this needs to be as simple as I illustrate here. Point to the beginning of the curve, point to the end of the curve, and get the software to derive the constants. Now you can release this batch if it, indeed, fits the specification for this central rate of change.
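
As an illustration of that feature-extraction step, here is a minimal sketch, assuming the middle phase of the signal is roughly linear in time; the simulated signal, the window endpoints, and the specification limits are all hypothetical, not taken from the talk.

```python
import numpy as np

def middle_phase_rate(t, y, t_start, t_end):
    """Fit a straight line to the user-selected middle segment of the signal
    and return its slope, i.e. the rate of increase of that phase."""
    mask = (t >= t_start) & (t <= t_end)
    slope, _intercept = np.polyfit(t[mask], y[mask], 1)
    return slope

# Hypothetical triphasic probe signal sampled once per minute.
t = np.arange(0.0, 120.0, 1.0)
y = np.piecewise(t, [t < 30, (t >= 30) & (t < 90), t >= 90],
                 [1.0, lambda s: 1.0 + 0.05 * (s - 30), 4.0])
y = y + np.random.default_rng(0).normal(0.0, 0.02, t.size)  # measurement noise

rate = middle_phase_rate(t, y, t_start=30, t_end=90)

# Hypothetical specification window for the central rate of change.
SPEC_LOW, SPEC_HIGH = 0.04, 0.06
print(f"rate = {rate:.4f} per minute; within spec: {SPEC_LOW <= rate <= SPEC_HIGH}")
```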

To illustrate what I mean by improved methods for illustrating results, I give you what we call a visual process signature. In this example, the tall peaks are the ones that most affect the process outcome, and I'm defining the process outcome here as the back peak on this surface, which is dissolution rate. It's often the case, we find--and we do work for people to illustrate this--that the parameters that most drive the process outcome are distributed across the process.

In this example, we've got an API parameter, a mixer parameter, a drier parameter, and a coating parameter that all contribute the majority of the variability to the process outcome, being dissolution rate. One or two of these might be a PAT technology or probe, okay? The others are the traditional measures that span the process from raw materials. All that data still needs to be retrieved, made available, and analyzed so that a batch can be released more quickly.
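
One hedged way to picture such a process signature is to rank standardized regression coefficients of the outcome against the process parameters; the sketch below uses simulated data and hypothetical parameter names, not figures from the presentation.

```python
import numpy as np

rng = np.random.default_rng(1)
n_batches = 200  # simulated batch history

# Hypothetical parameters spanning the process, from raw material to coating.
names = ["API particle size", "mixer speed", "dryer outlet temp", "coating spray rate"]
X = rng.normal(size=(n_batches, len(names)))

# Simulated dissolution rate, driven mostly by the API and dryer parameters.
y = 1.2 * X[:, 0] + 0.2 * X[:, 1] + 0.9 * X[:, 2] + 0.4 * X[:, 3]
y = y + rng.normal(0.0, 0.5, n_batches)

# Standardize so the least-squares coefficients are directly comparable.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()
coef, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

# The "tall peaks" of the signature are simply the largest absolute coefficients.
for name, c in sorted(zip(names, np.abs(coef)), key=lambda p: -p[1]):
    print(f"{name:20s} relative influence on dissolution ~ {c:.2f}")
```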

Okay, now we come to the wrap-up: what's in it? The advantages of PAT. I see PAT as an excellent balance between compliance and economics. And we have before us, I believe, a rare opportunity to be able to drive change in the industry from an economic perspective rather than a disciplinary perspective. It refers more to the win/win that you've been speaking about all along, Ajaz.

FDA wants better compliance to assure safety and efficacy. They also want better manufacturing efficiency to lower prices. The industry wants to comply, but they lack the necessary software capabilities, in general. And this is my assessment from my years of speaking to people.

They've been building the cost of failed batches into prices. If only we didn't do that, presumably prices would be lower. And they want a shortened cycle time to improve process economics. Now, for this to work, the realities of the manufacturing data environment must be dealt with. What can we do about it?

From an industry perspective, I suggest boosting manufacturing IT spending. And I've underlined manufacturing IT because I think this is the area that needs encouragement from bodies such as yourselves. There has been plenty of money spent on IT in general in pharmaceutical companies, but I feel it has been misdirected and not applied specifically to the manufacturing area as it should be.

Include manufacturing users in budget prioritization. That means people who actually have to do the data-intensive decision making should be part of the decision making process in how those funds are allocated with respect to manufacturing IT.

And that, of course, would lead to implementing the underlying IT infrastructure needed for PAT. I mean, these things just go hand-in-hand. PAT, I think, is not such a huge revolution when we look at the trend the industry has been following up to now.

For vendors, let's make better software systems and work with the industry to define those needs, as opposed to making broadly applicable systems that don't get well used in the pharmaceutical area. And let's be honest about software capabilities. Let's face it, Excel add-ins are not the way to solve these problems for people who want to do the kinds of analysis I've just described. And let's provide better training and support.

On the FDA's part, continuing emphasis on GMP compliance and outreach is critical, because making the position clear that this is not a choice--that we must, in fact, comply with GMP--is very, very important.

Now, here's something that I haven't heard discussed that I'd like to really emphasize, and that is making opportunities--taking opportunities--to emphasize positive PAT economics, and I mean in very concrete terms. So I'd like to suggest that data be gathered and that a very concrete ROI case be made as part of what this committee does, to illustrate the very real economic benefits of shortening batch release times and product development cycle times. I've only heard about them in generalities up to now--I may have missed something--but I'd like to encourage a very concrete development of that case.

And so here's what I would suggest for this committee, if you'll forgive me for doing so. Continue the so-called safe harbor policy development. There may be a better term. Account for additional necessary manufacturing infrastructure. In other words, wide implementation of PAT, whether it be on existing processes or new processes will come to naught, unless we also develop the necessary infrastructure to make the whole thing work as I've described, because it's not just about that next new measurement or about the technology, in fact. It's about the whole systems approach that's needed.

Publicize compelling economic justifications, accounting for the hard costs, the soft costs, and the social costs.

Sponsor industry/vendor working groups to define needs, develop requirements and provide feedback. I would be more than interested--in fact, more than willing--to participate in forums under the FDA umbrella or this subcommittee's umbrella that specifically define requirements for vendors with participation from industry, so that we indeed develop systems that are actually needed.

So, to summarize and conclude, PAT implementation, I believe, will be more difficult for the reason that it doesn't involve, simply, the next new probe. It involves leveraging other systems that I believe are deficient today in the industry.

The challenges are similar to those in other data-intensive decision making areas, and that means poor availability of data--and by availability, I mean real-time access to data, not weeks-long retrieval--and inappropriate software systems built for people who really aren't able to use them very well.

PAT provides a unique economic incentive for quality compliance, and I've talked about that a little bit. It's a way of getting industry to use their own inherent motivations to achieve the same ends as what FDA and this committee would like.

On the FDA's part, I believe that it can be a catalyst for vendor/industry cooperation: gathering data to show the real-world manufacturing environment. What I've given you today is really only anecdotal, but it is my direct experience.

Publicizing the positive economics of PAT and providing the forums for interaction between vendors and the industry.

Thank you very much.

DR. LAYLOFF: Thank you, Dr. Neway. The next presentation is by Li Peckan and Allan Wilson.

MR. WILSON: Good afternoon, my name is Allan Wilson, I'm with a company called the 20/80 Group, and I'm an automotive manufacturing guy, is what I am. My background's in chemical engineering and statistics, but I came here today with my partner, Li Peckan, who handles human change management in the automotive industry, which is another interesting thing to talk about altogether--to talk about what we consider an interesting topic. And I hope you'll consider it interesting, as well.

We've been involved in the transition and transformation of the automotive industry. Myself for a little longer than Lee, around 20 years. And we've begun to be involved in the pharmaceutical industry, for the last year. We've done a little bit of work here and there. And we find some really interesting parallels in the changes that have occurred in the automotive industry over the last 20 years and what you're undertaking--you're already undertaking and you're going to be doing moving forward, I believe in the next decade.

So the first question is, why the comparison? Now, the obvious thing is the automotive industry has gone through some very unpleasant transformations, and a lot of those transformations were very hurried and they were forced. As I'm glad to admit, the automotive industry is not a monolith, and we've been extremely susceptible to the flavor-of-the-month thing. Like, you know, if I were to rhyme off the number of little certifications and qualifications I hold in this and that, it would be kind of terrifying. You have an opportunity, I believe, to take a more measured look at your industry and take advantage of some of the learnings that have gone on in other places in the universe.

So the first thing is, what happened to us back in 1980, in that time range? Basically, three things happened. Three really unpleasant things happened, and they happened in concert. Basically, we had problems around pricing, quality, and foreign competition.

The first thing is pricing. This is a chart that I picked up from public sources. This is the cost of crude oil over a relatively short period of time, sort of centered around 1980. That came in combination with the fact that this is kind of what the typical car looked like at that time--as a matter of fact, this looks very much like my buddy Don's Duster, and it was actually a small car at that time and very fuel efficient. And people began talking about these things, you know, as these terrible gas-guzzling dinosaurs, so the domestic automotive industry began to come under all kinds of unpleasant pressure to find more fuel-efficient vehicles.

At the same time, just in case lack of fuel efficiency wasn't bad enough, we had some really significant quality problems going on, of which this was probably the most dramatic incident. You know, those of you who remember Ralph Nader. But there were all kinds of other things, basically around product quality.

So at the same time as we were seeing issues around cost and around product quality, there began to be a groundswell in the so-called consumer culture of North America to change cars. Just in case life wasn't bad enough, along came Toyota. I still have a hard time saying that name, sorry.

And what those--what those scoundrels did was they delivered good, fuel-efficient cars. And it really shook us up. And we had to make some very significant changes and very painful changes, and in some cases very hurried changes, relatively speaking, in the domestic North American automotive industry.

But when I think about the changes that have happened, consider this and sort of contrast this with the situation that you find yourselves in today.

Think about the typical automotive assembly plant that makes cars. What I'll call a mini automotive assembly plant. Twenty years ago, a typical automotive assembly plant would make about 800 cars a day.

Nowadays, a fast one will make almost 2,000 cars a day. In the meantime, that same automotive plant, which used to occupy around 2.5 million square feet, has gotten much smaller. Actually, there are plants coming off the boards right now that are under a million square feet, which, considering the amount of activity that goes on, is quite astonishing.

At the same time, that work-force number--those are the number of people who basically have to clean up messes. The kind of people who go, well, here's a car, oh, my goodness, it doesn't work, we have to do something to it. And there used to be hordes of people at the end of the typical assembly line who had to fix things gone wrong. Hundreds on a given shift; 500 would be a typical number. Now, there are very few of them.

At the same time, that's product warranty--things gone wrong in around, you know, a three-month time frame. The things gone wrong have dropped by over an order of magnitude. So, basically, when you buy a car today, you don't expect nearly as much trouble--by an order of magnitude--having to take the thing back to the dealer as you did 10--well, actually, 20 years ago.

On-site inventory, these are basically the parts being held at a main assembly plant. The on-site inventory levels have dropped, essentially, by an order of magnitude. And, similarly, the incoming quality problems from vendors, because as you can imagine, you buy all kinds of bits and pieces to make a car. And there are very complex vendor chains. The incoming quality problems have decreased by two orders of magnitude.

So, I would ask you to consider two things, okay? The first thing I would ask you to consider is how much the car you drove here today--with air bags that weren't there before; an antilock brake unit that wasn't there before; an engine whose spark plugs you don't have to change for 100,000 miles, which wasn't there before--would cost if this hadn't happened. It would be astonishing. Actually, you wouldn't be able to buy that car. That car would not be able to exist, because people wouldn't be able to afford it.

The second thing I would ask you to consider is what would happen if you were able to trust your vendors more by two orders of magnitude. Or if you were able to run, basically, five times as much material through your existing CBER-licensed facilities or CDER-licensed facilities without having to--like, I have a fair idea of the costs associated with licensing new manufacturing facilities in the pharmaceutical industry. That's the kind of transformation that the automotive industry underwent. And I believe that you're on the way to experiencing similar transformations.

Now, what I would like to do--once again, the automotive industry wasn't a monolith or anything--but what I'd like to do is hit upon six major changes that happened to us. And it's the kind of thing that you can only look at with the benefit of 20/20 hindsight, you know, because at the time it was kind of messy. Like when Iacocca blew up that assembly plant on television, we all cried for three days; it was kind of unpleasant.

So, think about, basically, these first three things. The first thing that changed is the understanding of what our customers wanted--the perception of who our customers were and what they were willing and happy to pay for. And we realize that the pharmaceutical industry already has a very strong understanding of that, but the FDA is essentially a customer of the pharmaceutical industry, and how does that relationship play out?

The second thing is to be mindful of your competition. Know exactly what your competition is doing and I think that this is, for instance, a marvelous forum to drive a certain amount of standardization in the industry. The automotive industry is the most benchmarked industry on the planet. There are people benchmarking everything imaginable about automotive. And I think that's a very good thing.

Develop a strong focus on product quality and finished goods quality, which you already have, obviously, but that was actually needed in the automotive industry, because the '50s consumer culture said anything you can make, people will buy. So we actually needed to develop that understanding of what our customers were willing and happy to pay for.

But these final three things, I think, may actually be of more use to you. When I looked at the situation in your industry, I said, well, what's your product life cycle? Basically, to simplify it, how long is a product in production? A typical number might be ten years. In the car business nowadays, a typical number is three years, but despite the fact that a given product is only actually made for three years, there's an absolute flurry of continuous improvement activity going on through that full three-year period of time, right up to the point where you stop making a model. You continue to improve it. And the reason is very simple: those improvements don't last just the life of that particular model; those improvements are fed back into design, development, and the launch process for the next models.

So you have a marvelous opportunity--a two-fold opportunity--if a way can be found to make continuous improvement work in your part of the world. Because, first, you have very long product cycles, so anything you can do to improve the quality, the reliability, or the efficacy, or lower the cost of the products that you make, you'll have a huge period of time to realize the returns on that. So from the manufacturer's point of view, as well as the customers', if a way can be found to achieve that, that will be extremely good. Plus, you'll be able to feed that forward to future endeavors.

The second thing is a focus on integration of effort. Car companies are big. Car companies are extremely big. Car companies have many vendors and suppliers who are also extremely big. Each of those entities has functional groups in them--manufacturing, quality, engineering, marketing. In the evolution of the automotive industry, a greater working-together type of evolution has come about: mechanisms and actual structures to manage the life cycles of products, to speed the life cycles of products, and to actually force collaboration between functional organizations.

And I believe that that would be of great value to this industry, because, to start with, it's a huge industry, the companies are very large; the regulatory bodies are very large; the supply base is very large. And that integration of effort, in conjunction with improvement can be an extremely powerful thing.

The final element on this list is a redefinition of mass production. Henry Ford invented mass production, so the automotive people thought they had it down cold, until Toyota reinvented mass production--and we like to think we've re-reinvented mass production over the last decade. And some of the key elements are elements that you have been talking about today: elements around understanding your processes; around product quality as well as process quality; around understanding the quality of the raw materials and designs that go into the products that you prepare; and, also, an understanding around error proofing of those processes and those products, so that the likelihood of problems arising on an ongoing basis is drastically reduced.

And those are the messages that I would like to carry to you from the automotive industry: that those things are possible; that they seem expensive, but the payback is huge because of the time that's working to your benefit, especially with the large--at least in certain areas, very large--product lifetimes that you have available to you.

Now, this is just a brief slide around our perception of the similar things arising in your industry to what we saw, say, in 1980 in the automotive industry. We see pricing pressures, especially in the United States, around government bodies, HMOs, PPOs. We see the issues of regulatory pressures in your relationships with regulators; those pressures will continue. We see patent protection pressures--you know, the current paradigm around the expected life cycle of a product and the number of new products entering that life cycle pipeline. There seem, from our outside perspective, to be growing issues around, basically, the number of new drugs being launched at any given point in time and the ability to bury the R&D costs.

And what we would submit to you is that manufacturing can give you so much more in terms of resources to then plow back into your R&D.

And, finally, foreign factors. You don't have a Toyota, thank goodness, but there are issues around the relationships with foreign regulators and with manufacturing in various points in the world and the need to harmonize and balance those things.

So this is our current thinking. I guess it's a little more specific around PAT; the prior stuff was rather general. In PAT, I see, basically, four issues that are challenges that need to be considered. The first is around learning to manage large volumes of data. And from a statistical perspective, there's an astonishing pitfall that awaits you--and anyone who wants to talk about the technical details later, we can--but SPC is actually a survey sampling tool that's applied to manufacturing. If you do large volumes of sampling and you apply SPC parameters, which we had the unfortunate experience of doing, you basically drive yourself nuts.

Imagine your typical SPC run rule paradigm says that you can have, I guess, a false out-of-control signal about 30 times in 1,000 by the time you stack up the run rules. Well, that's okay if you only sample every shift, like Shewhart did in 1930 at Western Electric, but if you sample 1,000 times an hour, as you're likely to with PAT, then you'll need a different mathematical approach to handling that data so you don't basically drive yourself into a tizzy, which is something we experienced.
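
A back-of-the-envelope calculation makes that pitfall concrete; the roughly 3 percent per-point false-alarm figure is the speaker's own number, while the sampling frequencies below are assumed purely for illustration.

```python
# Expected false "out of control" signals from an in-control process,
# taking the speaker's figure of ~30 false signals per 1,000 points once
# the run rules are stacked (about 0.03 per plotted point).
p_false = 0.03

scenarios = {
    "one sample per shift (3/day)": 3,
    "PAT probe at 1,000 samples/hour": 1000 * 24,
}

for label, points_per_day in scenarios.items():
    false_alarms = p_false * points_per_day
    print(f"{label:32s} ~ {false_alarms:8.1f} false alarms per day")
```

Under these assumptions the shift-wise chart produces a false signal roughly once every ten days, while the high-frequency probe produces hundreds per day, which is the reason a different statistical treatment is needed.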

Then there's the whole measurement systems thing, and I believe that you're already addressing that very well, but your measurement systems have to be reliable, they have to be accurate, and a great deal of energy--in terms of development, in terms of mathematical development and effort--has been expended in the automotive industry around our measurement systems in order to understand how we're doing and be able to react in real time.

Process understanding--this really resonates with the discussions that you were having this morning. Process understanding drives the appropriate application of all this, because if you don't know what to measure, then you're going to dump tons of resources into measuring things that you actually don't need. You'll expend these resources and you will needlessly chase ghostly images of poor product quality. So good product knowledge and process knowledge are, of course, at the core of all this.

And, finally, one thing that I was very excited to hear this morning was the notion of simplicity and parsimony in all things. There's a temptation that we fought in our industry for the last decade that's been brought about by the amount of high-speed data acquisition equipment that's become easily available, relatively cheaply available--sensors, SCADA--so you find yourself able to measure all kinds of stuff without necessarily knowing that you're getting value out of that volume of measurement. So simplicity and parsimony, and making systems such that errors are unlikely to occur, are extremely valuable. And the whole discipline of error proofing and poka-yoke, which, once again, is a bit of a technical specialty, is, I believe, of extreme value to the pharmaceutical industry.

So those are our final comments. Basically, that we believe Process Analytical Technology is an important step on the, I guess, what you might call the quality journey of the pharmaceutical industry and we believe that this is an excellent thing. Thanks.

MR. KLEVISHA: Thank you very much. I'd like to introduce myself and a colleague. My name is Dan Klevisha, I'm the vice president of Bruker Optics. I guess, in terms of PAT, you can think of Bruker Optics as a sensor supplier--a vendor of sensors. My background is not in pharmaceutical analysis and production, but I believe that some of the experience our company has gained in different industries is relevant to PAT and justifies this initiative. Our presentation will be in two parts; Tom Tague will present the second part. Tom is a senior applications chemist at Bruker Optics, in Massachusetts, and he'll discuss some of the strategies for partnering for development of new technologies.

So, I believe it is very apparent, and has been said several times quite eloquently, that the PAT initiative is extremely important. I think you can take lessons from other industries and, certainly, apply them to the potential of PAT. Certainly in the chemical industry, in the polymer manufacturing area, Process Analytical Technology was initiated and implemented as an innovation and a way to improve profitability, and it has moved from that to being absolutely essential to compete in today's global market. And we see many examples in industries outside of the pharmaceutical area where what I would call the PAT-equivalent application of process technologies has taken chemical companies that used to produce a lot of off-spec material to the point that they've eliminated off-spec material, creating an opportunity for payback and return on investment in one to two months, in some cases.

And in other cases, the testing for analytical services for the polymer industry has been reduced from maybe 20 technologies down to just a handful of technologies that could be administered for a whole plant by just a small number of people. So there are certainly a lot of opportunities for a great deal of cost efficiency in manufacturing.

The aspect that we would like to very briefly touch on is what a vendor is and the essential nature of partnering to achieve the goals of rapid and efficient PAT. I think you can kind of break down many of the opportunities for PAT sensors into the application of what would otherwise be established technologies, such as near-infrared for online driers and blenders and content uniformity measurement equipment that's been well used in laboratories, and even in some process situations, but is not nearly fully exploited to the potential that is available to the pharmaceutical market.

So these are more or less existing technologies that can be applied on a greater scale in, perhaps, innovative and unprecedented ways. And there's a whole area of new technologies that simply haven't been possible before and both present good opportunities for the future of PAT.

Our company manufactures vibrational spectroscopy equipment: near-infrared, mid-infrared and Raman equipment. And these technologies are highly applicable because of their nondestructive nature, real-time analysis and applicability to a wide range of processes. So we're viewing this somewhat narrowly, in terms of spectroscopy and specifically vibrational spectroscopy among all the technologies, but, clearly, there are opportunities across all aspects of pharmaceutical production for vibrational spectroscopy technology and other analysis techniques. And I think that's well understood within this expert panel here, so it doesn't need to be reviewed.

If you look at the paradigm of the polymer and chemical industry, you can see that a lot of the historical usage of process equipment was in the area of liquid analysis: take a fiber optic probe and put it in a liquid stream or reactor, bypass line, something like that. And near-infrared techniques have been widely used for raw material testing. Those limitations, in terms of use, have been widely expanded over the last few years and I think will continue to be. The possibility of putting, for example, fiber optic probes in blending and drying operations, and the possibility of using non-contact analysis--in this case an FT-NIR system configured for drying measurements--to measure solid-phase materials very rapidly online, has greatly expanded the use of online technology in other industries, including in the pharmaceutical industry.

There is a whole range of tests that are conducted on a laboratory basis that can potentially be administered online, to measure much higher volumes of materials with much more precise analysis than has been possible before, and Tom will touch on some of that, including the use of equipment that previously was limited to laboratory use, such as FT process Raman, and is now readily available for use in a process environment.

One of the challenges, I think, going forward with respect to regulation of PAT is how do you take equipment that does the analysis that's required--eliminating the need for more laborious, slower analyses such as chromatography, titration, and wet chemical analysis--and implement all that in a way that can be compliant with all the regulatory requirements, such as 21 C.F.R. Part 11, validation, IQ/OQ/PQ, and things like that.

And much of that requirement for instrumentation translates to the PAT environment, but it potentially gets more complicated and more challenging as equipment runs in real time in more demanding environments.

I mean, some clear aspects of PAT are that this needs to be very broad-based technology usage, and I think--maybe it's not too strong a statement to say--that some technologies are going to be more of an evolutionary implementation, where they're already in use and understood and they're going to be used on a wider scale for process applications, and there's also the possibility of some fundamentally revolutionary stuff in terms of the instrumentation and the benefits that manufacturers in the pharmaceutical area can receive.

So, I think I'd like to, at this point, turn it over to Tom Tague who will discuss a little bit of our concepts and strategies for partnering with new technology development.

MR. TAGUE: It seems to me, over the last 10 years or so that I've been involved with instrument manufacturers as a scientist, that the instruments are pretty much developed by the vendor looking at the industry and saying, well, this instrument could be helpful, and then we go through a two- or three-year development process, develop that instrument, and then kind of throw it over the wall. And then, in the pharmaceutical company, you are kind of left with: okay, is this technology useful, is this instrument useful? Or you are trying to evaluate all the instrumentation that's available out in the market and say, how can I fit this into my program--into my quality program--or is it too much work? Is the barrier too high?

And one of the things I hope can come out of these types of meetings is that we have a mandate where pharmaceutical companies and vendors will partner together, so that the implementation of technologies takes place in such a way that item 1 doesn't happen--where you spend a lot of effort to work with a product that has been on the market for a few years and then, all of a sudden, a new product comes out that makes the technology you just spent a couple of years incorporating into your own business completely obsolete.

One of the things that is also very interesting from a vendor's point of view is to be able to partner with pharmaceutical companies and other manufacturers so that we can develop new technologies and new products that will not only help you, but help many different industries. And I feel that you certainly have the attention of all the vendors. Bruker is probably a good example, and you could probably line up 50 different manufacturers and get the same exact words: that we would be very excited to partner and would be willing to go that extra mile in doing so.

For Bruker, our business has changed over the last five years, in that, when we do development, we certainly have GMP, GCP, and GLP in mind, as Dan alluded to. All our software packages and products are intended to be fully 21 C.F.R. Part 11 compliant, and that's through our interactions with you as our customers.

And 21 C.F.R. Part 11 compliance, I would say, for Bruker, is maybe the first example for us where we worked very closely with a few pharmaceutical companies to find out exactly what you wanted. Documentation was an issue: at first we thought we could provide documentation that was always about 50 pages long and didn't really have the detail that was necessary, and the end result, now, is very thick manuals that are very comprehensive and can withstand the most stringent evaluation by the FDA or, more importantly, by your own internal regulatory affairs people.

So we want to provide comprehensive support for achieving your validation and your monitoring goals. If we produce a product that can't be used in your laboratory because it is not compliant, then we've defeated our own purpose and we've wasted your time. So software compliance and documentation are big for us.

One of the examples that you can look at is trying to do direct analysis of a reaction vessel, where, in the past, you had to swab the vessel and then do an HPLC measurement. It can be a very tedious process, it's not a very efficient process, and it's not really quantitative in nature. We developed a product here that is probe-based, where you can just go ahead and get the answer for how clean the reaction vessel is right away.

I wanted to emphasize the next few slides in talking about microanalysis. This is a really good example of the way things are and have been developed in the past, and of where they can go in the future.

The first is this little compact, FTIR-based microscope. It's really good for identifying materials, compounds, contaminants--anything small, down to about 20 microns. It takes up very little space, and it works in compliance with the instrument. You just observe the sample and do reflection or ATR data collection on it, and it works very nicely.

But the next step might be to take this type of product and integrate it into a rugged interferometer that could be taken right at line and be able to solve problems for you very easily.

This is an example of what it is capable of. These are microbeads--there's some dark field illumination there--and you can see, if you're familiar with infrared spectra, that we're able to very nicely differentiate between the two different microbeads. Very easily done, and the spectra are collected in a matter of a couple of seconds.

Then we have a more research-oriented microscope, and, again, this is an example of a research tool that could be brought more into the manufacturing environment, or many of its features could.

Here you have very advanced visualization capabilities for a sample, and then pretty much any mode of analysis, from looking at monolayers to doing ATR spectroscopy or transmission. And you have full data processing at this level; you can do full chemometric analysis in a very automated way on the data that you might get. And this is just kind of representative of the tools that are provided by many vendors, where they're research oriented, but it's through a partnership that they could be brought into the manufacturing world.

This is just another example of the next stage up, which incorporates a lot of different automation features into the product. And I'll skip over that one.

The next item is chemical imaging, and the reason I wanted to go through these last two examples is that you can see in our own products where there's been an evolution; just as with PAT, there will be an evolution as to what can be implemented and how the collaborations, at least from the initial stages, will go forward. And, in this case, you can collect the data over a very large area--it may be as big as 6 mm by 6 mm--in just a few seconds. So now you can talk about looking at the homogeneity of a tablet in a very short period of time.

And you're essentially only diffraction limited now, and you can perform chemometrics globally over the whole data block. The previous two speakers have spoken about manipulation of data; this is a really good example of what can happen.

You could take one of these imaging systems and put it right at line and look carefully at a tablet. The problem is that you can generate 100 megs of data in one acquisition in six seconds. So how do you manage all that data? How do you get the information that you want out? There has to be partnership on how to get the answer without creating gigabytes of data in seconds.
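
To make the data-volume point concrete, here is a rough sketch; the detector size, the number of spectral points, and the choice of projecting each pixel onto a few principal components are all assumptions for illustration, not specifications from the presentation.

```python
import numpy as np

# Rough volume of one hypothetical chemical image: a 256 x 256 pixel FPA
# frame with 512 spectral points stored as 4-byte floats.
pixels, wavelengths, bytes_per_value = 256 * 256, 512, 4
mb_per_cube = pixels * wavelengths * bytes_per_value / 1e6
print(f"~{mb_per_cube:.0f} MB per acquisition")  # on the order of the 100 MB quoted

# One way to keep only the information of interest: project each pixel's
# spectrum onto a few principal components and store the score maps instead.
rng = np.random.default_rng(2)
cube = rng.normal(size=(pixels, wavelengths)).astype(np.float32)  # stand-in spectra
centered = cube - cube.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt[:3].T  # three score values per pixel
print(f"reduced to ~{scores.nbytes / 1e6:.1f} MB of score maps")
```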

This is an example of looking at bone tissue, if you wanted to monitor therapy--and that was the purpose of this investigation. You can see visually the bone tissue and then look at the infrared image.

And then, spectroscopically speaking, you can get at information that you can't get at any other way. That is, if you look at bone modeled after two years versus that of one year, by monitoring the amount of carbonate, which indicates the degree of mineralization in the bone, you can see the changes very nicely, and you can see the changes over the whole tissue of the bone in the same manner you would for any other large sample.

And then you can look at the chemical profile--maybe this is one of the most useful parts of doing imaging. The numbers on the bottom are small, but those are microns, from 0 to 660 across the bottom. So we're talking pretty small spatial resolution but getting a lot of detail. So, in another case, if you were trying to look at a tablet, you can, again, imagine that you have a 6 mm by 6 mm area and are gaining this type of spatial resolution to find out how good your manufacturing process is.

And this technology is available today and it's available from more than one company. And yet I fear that it may be many years before this type of capability is implemented.

This is one rendition that we offer that certainly does that type of job. You have a macrosampling area--you can envision a tablet hopper where you run tablets in there--and then, ultimately, I think--this instrument is not capable of it, because you're talking about 1,000 tablets a second instead of a couple of seconds per tablet--but I think that, from Bruker's point of view, and from the information we're getting, everyone would like to do every tablet. And so, with close partnerships, that's the type of information that can prove very valuable. In this case, we used state-of-the-art FPA detectors and video cameras to take care of that job.

Another interesting application is just kind of looking at using FT-Raman analysis and saying, okay, how easy is it to look at raw materials? Well, you can look through vials and bags with Raman using near-infrared excitation, so your raw materials identification can be done very, very quickly. I've worked with a couple of pharmaceutical companies over the last few months looking at these things, and you can identify raw materials with unit efficiency in a matter of just a couple of seconds. So it needs to be done and can be done.
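
A minimal sketch of how such a rapid raw-material check might work, assuming a simple correlation match against a small reference library; the library entries, the simulated spectra, and the acceptance threshold are hypothetical, not Bruker's method.

```python
import numpy as np

def identify(spectrum, library, threshold=0.95):
    """Return (name, correlation) for the best-matching reference spectrum,
    or (None, correlation) if the best match falls below the threshold."""
    scores = {name: np.corrcoef(spectrum, ref)[0, 1] for name, ref in library.items()}
    best = max(scores, key=scores.get)
    return (best if scores[best] >= threshold else None), scores[best]

# Hypothetical two-entry library of reference Raman spectra (Gaussian bands).
shift = np.linspace(200.0, 1800.0, 800)  # Raman shift axis, cm^-1
library = {
    "lactose":   np.exp(-((shift - 1085.0) / 15.0) ** 2),
    "cellulose": np.exp(-((shift - 1375.0) / 20.0) ** 2),
}

# Simulated measurement through the container: a noisy copy of one entry.
measured = library["lactose"] + np.random.default_rng(3).normal(0.0, 0.02, shift.size)
print(identify(measured, library))
```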

This is another example of mapping the surface of a tablet using Raman. It can be done many different ways, you can monitor the concentration of aspirin very carefully. You can--let's see--I'm not sure why my Microsoft software's not showing that figure very well.

DR. HUSSAIN: Tom, we need to go on to--

MR. TAGUE: What's that? Okay, I'll skip over that. Essentially, you can look at cross-sections, as well as the tablet itself and also the degree of hydration on the tablet surface.

And, lastly, we also offer other products that are good for cellular analysis and bacteriological analysis, and the detail that we've gone to, even here, is that, for example, with E. coli, we can readily identify, in just a few seconds, almost a hundred different strains of E. coli, just by streaking the bacteria on the zinc selenide plate, popping it into the spectrometer and getting the answer right out.

And, in conclusion, I think the future of PAT is bright, if people take action and the partnerships actually do take place. I think all sides are motivated. The FDA has called these meetings and appears more than willing to facilitate collaborations so that the manufacturing processes can be more efficient and more cost effective. And, certainly, I think you'll find that the vendors are fully motivated. And I thank you for your attention.

DR. LAYLOFF: Thank you very much, Dan and Tom. And we'll continue now on our questions.

If we could go back to page 4, I'd like to start back with Question Number 4c: To minimize or avoid disputes, should a priori criteria be developed to assess whether a problem uncovered during PAT implementation was present all along during the prior manufacturing history of a product? Page 4, number 4c.

DR. MORRIS: Just a question of clarification, I think, Tom. When you say a priori, does that mean that the data already exist from the compendial testing to show that there's a problem? I'm not sure--

DR. LAYLOFF: You mean, to have criteria established before you find the problem?

DR. MORRIS: But, I guess--because my next question--it comes down and says, for current products that need improvement, being considered case-by-case. But I'm saying--I guess I'm confused as to what criteria we're talking about.

DR. HUSSAIN: Let me try to clarify in this instance. I think, since we have heard so much about finding flaws--I mean, everybody seems to be saying that we will find flaws--I'm sort of looking forward to saying, let's maybe define some criteria for whether something is a flaw or not a flaw, you know. When you start with a product with a good compliance history, is that enough to say, that's fine, whatever variability is observed is fine? Is that enough, or should we try to do something beforehand to potentially avoid any disputes or disagreements?

DR. LAYLOFF: Set a threshold for the dispute?

DR. HUSSAIN: Yeah.

DR. MARK: Well, during one of the breaks, we got into a discussion with some of the other people here and brought up the question of, well, suppose one of these process--I guess maybe I'm sort of jumping the gun on this question, but in response to the discussion on some of the earlier questions, the question came up: well, if something shows up because you applied the process analytical technology, what is the company going to have to do about it? Are they going to have to jump on it right away and put a priority on it over all their other projects that are going on? Are they going to have to put it in the stream of research projects and take care of it in due course, considering that they have been manufacturing this product, you know, in what was considered a satisfactory manner until then? And that's probably going to be an important decision that's going to have to be made in terms of the guidance as to what's going to have to happen.

MR. FAMULARE: I'm sorry, I think it's a decision you're faced with under the current paradigm when things become revealed to you--that there's a problem--whether PAT gave it to you or whether current validation and conventional testing gave it to you. You know, we touched on, in question 4b, risk-based and what that really means in terms of the application of GMPs and problems uncovered during PAT R&D efforts. And, again, as a company that's responsible--maybe risk-based wasn't the best term to use here--but when FDA, for example, sees a problem, first of all, it has to pass the "well, what difference does it make" question, you know. Does it really make a difference in the process, or is it just sort of a specification that maybe we could re-evaluate, in terms of whether it's in the application process or whatever?

The second issue is, what is the public health impact--maybe, as opposed to saying risk-based? If we see products that don't meet their legal specifications, they're in violation. We have ways of dealing with that. If it's a minor issue, the company comes back with a corrective action plan, and the timing of it is appropriate to the meaning of the violation, or to whether it does have a public health impact.

If it's a violation that may cause them to question the product being on the marketplace, we have criteria that we look at as to what the public health impact is and whether it should rise to the level of a recall. There are different classifications of recall. So I don't think it's a new paradigm; I just think it's another factor that goes into that paradigm.

DR. HUSSAIN: It's not a new paradigm, but there is a difference. The difference is, under the current system, the product has no compliance problem; it is on the market. So when you add a new approach, then something becomes visible. So it's a different question in a sense. Under the current paradigm, there's no issue at all. So for this class of problems, quote/unquote "problems"--I don't know whether we should even call them problems, because they're not problems--these are really variability and observations which are not visible under the current system.

DR. MARK: So, it does go to deviation handling, Ajaz, and that goes to compliance, that goes to regulatory issues, that goes to, you know, what we were talking about this morning. So, suppose the damn sensor goes off and it goes wacky, okay, so there's 5 minutes or 10 minutes of product that's being redirected into a different stream of product collection. Now, how do we handle that deviation? Do we go back and find out that, okay, the sensor's off, so we get that fixed; do we then take the material and put it back through the system, or do we test it by the current applicable, approved methodology? These are the nuances--

DR. HUSSAIN: That's a sort of different example. What we're talking about here is, a company is willing to put PAT on line and go through the validation development process. I think the better example is one which was supplied to us by G.K. Raju, and it's in Janet Woodcock's presentation. And the example simply is, a company would like to--and it's a real-life example, it is the data that he has supplied--would like to do online blending uniformity analysis. And now, when they are doing this in an R&D effort, they're using the same product, the same conditions, but not in the manufacturing setting. They see non-normal distributions or trends which show that the current blending, as it's being manufactured, may have some deviations there which are not visible under the current system.

It's also an example that I'll sort of offer you, in the sense that, due to the PQRI blend uniformity process, a company, which will remain nameless, wanted to do the stratified sampling and get data to support the PQRI proposal. A validated product, on the market, meets USP content uniformity, has a history of that--even meets the blend sampling analysis, without any problem. So when they did the stratified sampling, they found that towards the end of the run, there was a deviation. That happened. But they did that only to sort of be nice and give some data to PQRI, so are they in trouble now? So they sent the data to me--I didn't hide it, I showed it to you guys before--to fix the problem, right, and how? And that's the scenario we're talking about.

MR. FAMULARE: Again, the aim is, you know, there's so much focus on what will happen in compliance, but the aim is to go to product improvement and if the compliance enforcement policy is such that it penalizes you, then we will have defeated our purpose.

DR. BOEHLERT: I was going to point out these things happen now, as Joe says. And, you know, I've run into situations where somebody, in analytical testing, say, has set a limit on impurities of 1 percent and they've been testing it all along and find less than 1 percent. They improve the method and now they find out they have 2 percent. It doesn't meet their filed regulatory specifications, so the question is, has it been there all along?

And the best you can do, very often, is infer that it has, because you haven't changed the process and you haven't done a lot of things, but you can't go back and reanalyze samples that are beyond your retain period. So, you know, you might want to talk about an approach on how you might handle those situations, but, in fact, people face it all the time right now.

DR. HUSSAIN: I think that's a very good example. And maybe--let me try to answer that question, maybe that'll help the committee.

The current testing paradigm that we have is the sort of limited testing and release, so we have data which is very limited. A company wishing to do PAT on line may say, all right, we are going to establish a baseline--which would be to collect all the information, the history of that product and so forth--and have that available, too, for discussion, because that becomes a baseline for that product already. And then any deviations that are apparent under the new system are either corrected or not corrected, depending on whether they should be corrected or not.

But then the reference point is the baseline data. That may be a bit more data that they need to collect.

DR. MORRIS: Can I ask a question then? And maybe it's for both you and Joe, Ajaz, but 4c, then, is sort of a question that differentiates the way we are currently handling, say, out-of-trend data that are not part of the manufacturing process compliance testing now, versus how we would handle it during PAT implementation. Is that a fair assessment?

DR. HUSSAIN: In my mind, it is, because, in my mind, you're trying to create a whole new team for review and inspection. And in some ways a new process for handling all these issues--

DR. MORRIS: Right, I mean, because--

DR. HUSSAIN: --I mean, hopefully, a better, more efficient approach, so.

DR. MORRIS: I would just say that the idea of a safe harbor doesn't--I mean, what you're describing, Joe, is sort of what's done now anyway. And all I'm saying is that I foresee it implemented somehow that does differentiate, in terms of extending the harbor, if you will.

MR. FAMULARE: That's true in the sense that, as Ajaz started out before, it is a new paradigm. You know, all we have now is based on the conventional methods of analysis. So it is a new paradigm, because you're voluntarily introducing a new factor towards product improvement. So it would change--that's why, as Ajaz says, we have the team approach, the training that we're going to be talking about, et cetera, and ways to deal with that new paradigm.

DR. SHEK: Yes, if we go back to the specific question of whether a priori we should have some guidelines. I just listened to the discussion; in real life, I don't know whether that will be practical until you actually go and do the test. And the scenario is a little bit different today. We are changing, let's say, the sensitivity of the tool that is in our hands now. So by definition, we are going to see things that we haven't seen before. Then you have to go and ask the question, what does it mean, right? Whatever we call it--is it important or is it not important? And that's going to be decided case-by-case. So the best approach would be, of course, that the understanding comes both from the regulatory agencies, as well as from the manufacturer. It has to be case-by-case, and understanding what's really happening there. Today, we establish specifications based on the analytical tools that are in our hands and based on the process capabilities. And we might come to a situation where the PAT will indicate to us that you should go some direction, and maybe the process would enable you to reach there. This has happened over and over again in analytical, right? We developed sensitive, you know, perfect separations, but we didn't have a detector that could pick out those differences. Then we came out with super detectors that were able to, and now we improve on the columns.

The same thing might happen here, but it might take, maybe, a little bit longer. So the most important thing is to have this dialogue going on and understanding what PAT is doing and what we are observing there, but to come out with a priori rules, I think it will be difficult.

DR. HUSSAIN: With respect to impurities, in one sense, because of its presence all along, it's qualified on that basis, I mean, so that would be the approach.

So in some cases, whatever flaw we see would be qualified on the basis of historical presence. And that would be one approach.

So there wouldn't be any issue remaining at all.

DR. KIBBE: Just--I'm having a hard time imagining these disasters, which--but I'm going to work my way through it. If we put a new way of monitoring a process in place, because we are ethical manufacturers and we want to always improve our process, and we suddenly find that a certain portion of our batch is always out of compliance--okay? Now, what does that mean to the end user? And, as a company, first, you should be overjoyed that you found this problem, because you will now be able to not disadvantage your end user, okay? And finding it, whether we use PAT or we do something else or somebody else finds it for you, it still has to be remedied.

My personal opinion is, however, that what you will find are things about your process that really don't disadvantage the end user. You'll find things out that are within the general scope of what we've used as a way of clearing each batch already. And any change or any variation which does not exceed the batch requirements that we already have in place is yours to deal with and not the regulatory agency's to deal with. They're not going to say you have to throw that batch out because you have some variation within it that's still within the framework of how you got the batch approved.

And if we do find that the last 600 tablets of every batch you ever made are junk and should be canned, then we're going to ask you to fix that, because we don't want you to keep sending out those 600 tablets. And you don't want to do it either, all right?

So I think--

DR. MORRIS: I think that's the easier case, though, Art; I mean, I don't think anybody's arguing that. I think the question on the table really deals more with paving the way for companies to try, if they don't want to have to trigger an OOS investigation when it's one of the variations that you're talking about that doesn't affect the end product.

DR. KIBBE: And that's what I'm saying, if it doesn't affect the product then that's where we draw the line.

DR. MORRIS: I mean, I can't speak for Ajaz here.

DR. KIBBE: You don't have to do anything else. If you find a variation and it doesn't affect the quality of the product, vis-a-vis the standards that you've already established for an ongoing product, the agency isn't going to make you do anything outstanding.

DR. MORRIS: Well, then, maybe that's the a priori criteria then.

DR. KIBBE: All right, and then if there is an impact on the patient, you better do something because once we find out, we want you to do something.

DR. HUSSAIN: In that regard, I think the proposal for--sort of the definition of safe harbor covered that, and you'll not need any more things, okay.

DR. LAYLOFF: I guess--did you want to--

MR. HALE: Yeah, just a quick one. The a priori part of this in my mind is, if you're going to make an effort to collect more data, what is the purpose of collecting data in the first place? I mean, putting a sensor on for the sake of putting a sensor on makes absolutely no sense unless you have a plan that you're going to look for something. And it goes back to the idea of development; even if you're in manufacturing, there needs to be a purpose up front for doing it. And if you have a purpose, then that thought process should be played out so that you have a process to react to the data. I mean, putting sensors on for the sake of having sensors doesn't make any sense to me without a thought process that goes into it.

DR. MARK: Yeah, it's sort of--a good point. It occurs to me that if you say, okay, we're going to use this to improve our process, you're sort of saying, we're going to use this as a way of telling us when the process is not as satisfactory as it could be, if you will, and that's almost tantamount to defining the process as being unsatisfactory, which is a danger or a pitfall you might fall into if you're not careful. You want to improve the process, but you don't want to say, well, the process isn't satisfactory now, because we can improve it.

MR. CHISHOLM: Yeah, I'm just thinking, as I do, that if I was a man from Mars and landed in this, I'd be listening to an industry that's absolutely scared because it doesn't believe that its existing process is good enough. And I think, as an industry, we had better be somewhat careful of that. I totally support Arthur across here. We're an ethical pharmaceutical industry; if we're putting rubbish out there that ain't helping the patients, we want to know about it.

I think the problems you're talking about relate to other things. I think that we will have more trouble with our internal regulators and QA people sticking to your rules than anything else. I mean we'll be simply embargoing and putting into quarantine more and more batches if we're not careful. And that's up to us as an industry, to sort out. We've got to sort out both sides of the fence.

The thing that concerns me is I think we're moving away from yes and no to maybe, which I said this morning, but, I mean, you have to try and define that a bit better. It's when you actually want to control the process. If you are finding something's going slightly wrong at the end of a batch--and I'm quite sure there must be lots of examples of it, through blending, et cetera, et cetera, et cetera--do you have to throw away the whole batch, of which 95 percent might be good? In the currently existing situation, you would. These are the sort of questions I think that we have to address. Because if you're actually monitoring quality assurance in real time, then you know when it's going wrong. You know what's wrong and what's bad.

And, in fact, if you got to a stage in a tablet press towards the end of a batch that was going out of spec, stop the process. So we have to try and think in a different way. I mean, I'm a control engineer by degree; we have to think more along these lines and away from the old yes and no and into the control system philosophy. And we all have to do that, I think. And stop painting this dead lakes scenario I keep hearing.

DR. LAYLOFF: I have a comment. I think we've drifted off a lot into what is possible, rather than to what is probable. I reviewed, not too long ago, the content uniformity data on 10,000 different batches analyzed in one FDA lab in St. Louis and it was quite striking how consistent the products were and how few there were out of limits. I don't think there was a big elephant out there that people are going to trip on. I think what we're going to see is efficiencies in production. I think we'll see a better consistency in product, but I don't think there's an elephant out there that the industry's missed. It's a very good industry and we've got lots of good product out there and I'd like to move on to the next question. The next question? Yeah, go ahead.

DR. RUDD: Yeah, could I just briefly add one comment? And, actually, to reinforce what Bob has said, we need to think much more about the positive aspects and not get hung up on the potential ghosts and shadows that are out there to catch us. Can I give one example of where I think we could implement PAT overnight? We've had debate, already, about, you know, is it a development thing, is it a manufacturing thing? And we all have views on that, but we could implement the following example overnight for existing products.

And I'll call this a hypothetical situation, although it may ring true with some of us in the industry. Not with GSK, I hasten to add. But imagine a liquid suspension product, where the bulk suspension is being filled into unit containers. We've all had experience of homogeneity issues there, whether it's sedimentation, foaming, flocculation, that sort of thing, such that it's possible that towards the end of the filling run, you may have to discard a certain amount of the bulk or the fill material because it's subpotent material or superpotent material. It doesn't happen in GSK, but some of us may have products like that.

The current approach that we're using is to play safe, you know; we do some validation studies and we say, okay, the last 10 percent maybe is at risk, so we'll reject and discard the last 20 percent. Kind of a rule of thumb there, you know; that's the way we solve the problem at the moment. How about, overnight, we implement a PAT approach: we put some fiber-optic UV, a fiber-optic approach, in there and we continue filling until the point at which we begin to get close to subpotent or superpotent material. That may be 20 percent on some occasions, it may be 10 percent on other occasions, it may be 1 percent on subsequent occasions. So, without investigating the process, without doing any development work, simply by implementing the measurement, what we've achieved is a level of control, the thing that Bob was talking about. And that level of control has improved our process, because now we're not working within this, you know, belt-and-braces safety barrier of rejecting, for example, 20 percent every time; we're rejecting the amount that needs to be rejected, and probably, routinely, that would be a lot less than 20.

So the point, really, is just to recognize that there are different levels of implementation here and we really shouldn't get hung up on the potential risk and ghosts and shadows. Let's look at the positive bits and let's do those if they're quick and easy to do. Let's do them and let's do them overnight. Thanks.
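A minimal sketch of the kind of in-line check described in the hypothetical filling example above is shown below. It is purely illustrative, not any company's actual method: the probe interface (read_potency), the 90-110 percent window, and the simulated drift are all assumptions for the example.

```python
# Illustrative sketch only: stop a filling run when an in-line potency
# signal (e.g., from a fiber-optic UV probe) drifts toward sub- or
# super-potent material, instead of discarding a fixed 20 percent.
# The probe interface and the limits below are assumptions.

from typing import Callable, List

def run_fill(read_potency: Callable[[], float],
             total_units: int,
             low_limit: float = 90.0,
             high_limit: float = 110.0) -> List[float]:
    """Fill unit containers until done or until potency nears a limit."""
    accepted = []
    for unit in range(total_units):
        potency = read_potency()  # percent of label claim
        if potency < low_limit or potency > high_limit:
            print(f"Stopping at unit {unit}: potency {potency:.1f}% out of range")
            break  # reject only the remainder of the run
        accepted.append(potency)
    return accepted

if __name__ == "__main__":
    import random
    state = {"n": 0}

    def simulated_probe() -> float:
        # Simulated signal that slowly drifts sub-potent toward the end.
        state["n"] += 1
        return 100.0 - 0.02 * state["n"] + random.gauss(0, 0.5)

    good_units = run_fill(simulated_probe, total_units=1000)
    print(f"Accepted {len(good_units)} of 1000 units")
```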

DR. HUSSAIN: An excellent example, I think, sir.

DR. RUDD: Purely hypothetical and not GSK, exactly.

DR. LAYLOFF: Okay, going on to Question 4d, What other mechanisms do you recommend for consideration? We've pretty much beat that up.

DR. HUSSAIN: Just--I think I want to sort of bring a perspective up. Listening to Bob and David and so forth, I think we were driven in this direction on a lot of these questions because at every meeting I have been to, every place I have been to, that's the only question: the flaws, the flaws, the flaws. I mean, I'm getting scared here. I totally agree with Bob, in this instance. I think we need to focus on the quality side of this. These questions were written, sort of, with that mind-set in mind. So.

DR. MORRIS: Just to follow up on that, though, I don't think you should be too hard on yourself here, only because--not to be representative of any companies, but for GSK and perhaps AstraZeneca, I mean, these may be lower energy barriers for PAT implementation than for other companies, and it's the companies who aren't already sort of embracing the mentality that you don't want to scare off.

So, I don't, I think it's fine to address them, I think you have to.

DR. LAYLOFF: Okay, let's move on to Question 4e: What are your recommendations for training needs and criteria for certification of the proposed PAT-Team?

DR. HUSSAIN: Let me just share with you the process that we have been engaged in in this instance. We have talked with three universities: CPAC at the University of Washington, with Mel Koch; Purdue University; and the University of Tennessee, with the Measurement and Control Engineering Center. And, essentially, we plan to work with these three schools to put a curriculum together.

And out of some of the discussion, I think the outline of the proposal that we liked most was from Kelsey Cook, from the Measurement and Control Engineering Center, and that's what we included in your handout. I think this is a very important sort of item for discussion in this committee. And what I would like is to have the committee just discuss this broadly and sort of give directions. Certainly we will have a working group on that, with Ken Morris sort of chairing that group. Maybe give directions to this group and what they should be focused on in developing that curriculum.

DR. LAYLOFF: And so, we're looking for suggestions for Ken Morris and the Education Group, which will be meeting after our break.

DR. SHEK: I went over, I went over what--you know, the attachment, I think PATs 2, and it looks, I think, very, very good. One thing I observed missing there: there is a section on pharmaceutical chemical processing fundamentals, but there is nothing about the pharmaceutical, you know, drug product, and I think it's extremely important to understand the processes that at least today the industry is utilizing, and it's a very wide spectrum. I mean, there's controlled release and regular, you know, but I think it's really important to understand the processes, and I think this may be a big chunk which is missing there.

DR. HUSSAIN: No, actually, we had internal discussion and we had--we sent this packet out earlier and, actually, we added some of that in.

DR. MARK: Yeah, my comments here aren't directly addressed to the question; sort of, they're addressed to the level above it. Because I'm wondering, for example, when the FDA decides to do something like that--to go into a PAT-type environment--does it have to go into it all at once and train all their inspectors at one time, or is it possible for them to run some sort of a pilot program where a few inspectors can be trained and their performance assessed to see where the weaknesses of the training program are, and spend some time developing the training program as it's sort of being tried out on a small scale?

DR. HUSSAIN: Let me just clarify that--in the sense we have been sort of--the plan that has been discussed earlier, essentially, is we have identified four reviewers, four inspectors; it is a small subgroup only at this time, so.

MR. ELLSWORTH: Joe wants me to add to that. Yeah, I think in discussing the training, especially for the field investigators, we're limiting it to a certain extent because we need to develop the expertise, but as part of this training, I think we're going to be learning as we're doing this, so we want a kind of greater control over the interpretation that occurs. But long-term, we're going to need to train a lot more, and we have a whole drug investigator certification program that we're developing, and we're looking at a higher level. We have level one, basic investigator; level two, which is a fairly extensive drug program; and probably a level three will bring in the PAT expertise. We're looking at that now.

DR. SHEK: But my question is, maybe, once we have the curriculum, who are going to be the teachers?

DR. HUSSAIN: What we would--we'd be looking for is the professors from these universities with invited folks from industry who would come and give case studies and so forth. And I think what we envisioned right now is the professors would come, teach in the Rockville area, so that I think we'll bring our reviewers and inspectors together here and then, hopefully, have hands-on experience at different locations. Maybe some companies would offer some hands-on experience. I know Purdue has offered their lab. So the core group would travel to these places and do the lab themselves.

MS. SEKULIC: I was just having a look at what's listed here; although it covers most of the scientific and technical aspects of what one would require, two comments, I guess: I don't see a lot on the informatics side, the software components, the validation. If we are to be developing some of these new technologies, then there is the partnership with vendors; that's a practical concern that we currently have and that could potentially be a hurdle. I believe that deserves a little bit of attention in the training component.

And I also, like, I think, Ajaz, you mentioned or somebody mentioned earlier this morning the mock sessions, I mean, sort of like play-acting scenarios--I think that's a great way of training individuals and we do a lot more of it in industry for various other reasons, but I think that that's a great training tool of actually putting people in pre-designed situations. It's a great motivational tool, as well.

DR. MARK: It occurred to me that if an inspector is going to be inspecting new technologies, they should certainly get some training and expertise and possibly by--by actual experience in real cases of using that technology and developing a method, you know, with that technology.

DR. MORRIS: I think the plans are that if they--depending on how it works, but I can only speak for Purdue at the moment, but, I mean, if you come to Purdue to work on the sensor-based lines that we have, nobody is idle. There's--everybody would do hands on is my vision of it. That the didactic part would be here, but that the practical would be at the universities and hands-on, you know, and I'm assuming that that's the case to the extent that it's hands on with--

DR. KOCH: Yeah, I guess, if you're getting to some of the discussion we've had within CPAC, we're assuming a role of exposing new measurement technologies and sensors that have been used successfully in other industries, or some evolving technologies, and then having case studies involving that and the data handling, which comes from industry participation, as well.

DR. HUSSAIN: The aspect of, I think, the pharmaceutical industry participating in the training program--I think that would be feasible. In fact, we felt that the three schools could partner with some companies willing to partner and then have that access, but through the universities rather than directly; that would be one option.

DR. LAYLOFF: Yeah, I'm sure that the knowledge base is primarily in the industry.

DR. KOCH: I think one thing I might add is I've picked up in discussion with various pharmaceutical companies, a tremendous interest in the later phases of this of them wanting to have their employees participate in some level of this to hear what it is that the reviewers are hearing so that there's a commonality in the language and the success.

DR. LAYLOFF: Yeah, I think probably one of the greatest incentives for PAT adoption in the industry is having FDA go out and get trained, because then they'll all want to get trained also; it will drive everybody.

I guess we can move on to the next question, Number 4f, on page 5; it has to do with mechanisms for review: What other mechanisms for both NDAs and ANDAs do you recommend for consideration by the agency so that a new drug development process may not be delayed due to the use of new PATs?

DR. CIURCZAK: If I can comment, because one of the things that we've always done with near infrared, for instance, is you have to have a validated method backing it up. And what I've always recommended to anybody is: get your NDA in with your standard HPLC and everything else and then send your NIR method in as an amendment. The same thing could apply here: everything we're going to do in process is still going to need a backup method, HPLC or a Karl Fischer or something else that you're going to calibrate it against. If you're afraid of delaying your NDA, you might just as well put your NDA through with the classical assays and then phase in PAT, either all at once or several methods each month or year, whatever, down the line.

DR. HUSSAIN: Emil, you just redefined the risk that we are trying to address with the questions--

DR. CIURCZAK: Well, in any case, this goes back to scaring, you don't want to delay because the financial thing down the line and I don't think anybody's afraid that their products are bad, you know, down the line. But I think you have a lot of financial people up there saying we have a limited lifetime, if this adds six months, nine months, a year to it getting approved, we're going to lose a bloody fortune here and be open that much sooner to competition, so, if you're going to have to develop traditional methods to validate these all anyway, you could always, if you're afraid of putting any of the PAT through, just do your first NDA that way.

And just one more comment, from my experience, the three batches that you have to get an NDA in, they're usually not enough, they rarely give good process information anyway. We like--I like to develop my NIR methods on, like, a year's worth of batches. So, go get your product out there and then start collecting data.

DR. LAYLOFF: I think that the training program for reviewers and inspectors for PAIs will help considerably. Also, I think the open-door policy that Ajaz has espoused--that, you know, you can come in beforehand, you can come in and discuss it, and actually work out the details on this before the submission. I think that the trained cadre, plus the open door to discuss these issues before the NDA actually hits, will get around a lot of that.

DR. HUSSAIN: In a sense, I think one of the proposals that we have is we could actually structure and have a special, separate meeting at the IND stage, at Phase II, where these concerns would not be an issue. So I don't want to take the negative attitude of getting the NDA out first and so forth; I mean, that's all we are thinking--I think we can do better.

DR. LAYLOFF: I think, rather than dropping it over the wall, to actually come in and discuss it would be better than kicking it over the wall. Okay.

Going on to Question 4g: What other clarifications should be included in the general guidance on this subject?

DR. LAYLOFF: It goes to Risk 4, which is that this would be a requirement, and we're saying it's not a requirement and this is voluntary. And we want to state--it will be stated in the guidance that this is voluntary, and so forth. Should that be--we hope that that will be enough, so.

DR. LAYLOFF: Will that be enough?

DR. BOEHLERT: I hope so.

DR. LAYLOFF: Judy says she hopes so, that's good enough for me. Going on to question 4h--wait a minute, did I just do that? No, that's it, yeah: What other approach do you recommend for consideration to address this concern? And that is, will the company need to use both PAT quality methods and conventional methods for regulatory purposes forever?

[No response.]

DR. HUSSAIN: To give you an example, the case study I constructed with the dissolution, doing dissolution with online assistance and so forth. The criteria could be you have established a correlation and to some degree, you have actually explained that the correlation is just not a black box, it's related to the formulation variables. And if that is acceptable, then that becomes the routine method. And so, dissolution testing for release may not be necessary at that point. And you may need to do dissolution for stability and shelf-life determination only, unless you have a method that even picks that out, so--
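A minimal sketch of the kind of correlation described here, under stated assumptions, is below: regress laboratory dissolution results against formulation and process variables from historical batches, then use the fitted relation as a routine surrogate predictor. The variable names, the linear form, and the numbers are all hypothetical, not the actual case study.

```python
# Illustrative sketch: fit a simple linear correlation between process/
# formulation variables and lab dissolution results, then use the fitted
# model as a surrogate predictor. All data and variables are made up.

import numpy as np

# Historical batches: [particle size (um), compression force (kN), % excipient]
X = np.array([
    [45.0, 12.0, 30.0],
    [50.0, 11.5, 28.0],
    [55.0, 13.0, 32.0],
    [48.0, 12.5, 29.0],
    [60.0, 14.0, 31.0],
])
# Corresponding laboratory dissolution at 30 minutes (% released)
y = np.array([82.0, 80.5, 75.0, 79.0, 70.5])

# Ordinary least-squares fit with an intercept term
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_dissolution(particle_um, force_kn, excipient_pct):
    """Surrogate dissolution prediction from process/formulation variables."""
    return float(np.dot([particle_um, force_kn, excipient_pct, 1.0], coef))

print(predict_dissolution(52.0, 12.8, 30.0))
```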

DR. KIBBE: Let me address that and one other little thing that we kind of talked about a little bit before we went to lunch, and I think it applies in here. And that is, there are times when what the Agency is willing to accept is not everything that a company feels it must do in order to get approval at various places and for various purposes. It's always good for companies to be able to carry a USP imprimatur for marketing and sales reasons and what have you, and if the PAT allows us to bag dissolution testing but then they can't say that they meet the USP monograph and things like that--and I think there might be an opportunity here--I know you're going to correct me--

DR. LAYLOFF: Okay.

DR. KIBBE: --okay, but I know that companies think about doing extra things to get different kinds of classifications, whether it's the USP or something else. And one of the things that we need to consider here as we move forward with PAT is how the Agency gets actively involved in making sure that anybody else who's regulating, or whose approval is useful to the company, is being brought on board with us; so that if we move forward with a certain kind of acceptance level for PAT, what is the Agency going to do with its colleagues around the world to make sure they're moving forward? That's, I think, the only other approach that we need to take in this area. Okay, now you can correct me.

DR. LAYLOFF: Okay. A USP product has to meet the USP standard if tested. So if you have a process of assessing dissolution and you validated it and you released product without doing the dissolution test, and you have stability data showing that it will meet it throughout the lifetime, then if tested in the marketplace it's presumed it will meet; if it doesn't meet, then it's an illegal product because it doesn't meet the standard. So you have to establish a validated process, you have to have stability testing, but you don't have to do the USP tests.

DR. HUSSAIN: In many cases, I think--or in most cases, you will have a traditional dissolution test established for that product, so you'll have that, but you don't have to do that on a routine basis to release the product.

MR. : But you have to do it to have USP on the label.

DR. LAYLOFF: No, no you don't. Judy, tell 'em Judy.

DR. BOEHLERT: If you manufacture a product that has a USP monograph then, by default, it is a USP product; you need not label it. You need to label the product if you want to declare it non-USP, that's a fact. USP, in the general notices, allows you to test it by other means. That's allowed. And so what Tom says is absolutely right. You know, you need not test, but if the product is picked up in the field, it must meet. So you need--if the USP method doesn't work, you've got a big problem; if your product fails the USP method, you have a big problem and you need to address that. But you need not test by the USP method, and you need not label your product USP; it is USP if there's a monograph. You need to label it if it's not USP, and there are products out in the marketplace now that are labeled non-USP.

DR. LAYLOFF: But there has to be a rationale for non-USP--

DR. BOEHLERT: There has to be a rationale--

DR. LAYLOFF: --it just can't be arbitrary.

DR. BOEHLERT: --and you have to put on the label why it's not USP, I believe.

DR. LAYLOFF: Right, okay.

DR. CHIU: Even today, without PAT, not all products are released based on USP tests, because our regulations permit an alternate test for routine batch release. Now, the alternative test needs to be equivalent to or better than the regulatory test, which could be the USP test. So, therefore, with PAT, if you have validated your technology to be equivalent or better than the standard dissolution test, you won't need to do that test, and based on the validation data you are sure, you know, every batch will meet the USP test, which is a lower standard.

DR. LAYLOFF: Another thing, and I think--having worked in FDA for about 20 years or more--one of the things that you find is that if there is an FDA-approved standard and a USP standard, and a product fails the USP standard but passes the FDA-approved standard, compliance won't take an action. If the USP standard changes but the product still meets the NDA standard, you're going to go with the NDA standard. So, in general, if something's going to happen compliance-wise, it's going to fail both.

DR. CHIU: Yeah, that's true. In either an NDA or an ANDA, we have regulatory standards which may not be the same as the USP standard, but they are always as good as or higher than the USP standard. At least--if it's equivalent, then they will use the USP test.

MR. CHISHOLM: Okay, I'm in a fortunate position of not knowing what USP is.

[Laughter.]

MR. CHISHOLM: And in the industry I came from, petrochemicals, it had to do with piping standards, actually. Coming back to NDAs, the problem is all about the size of the data set because, as this gentleman across here said, you've still got to have traditional methods to actually build the model in the first place. So you've had to do that work; the problem is your data sets aren't large enough, and when you scale up, you have to expand your models.

I think you have to--when you make your--and this is just a suggestion--when you make your submission, you have to have the methodologies in and the work done at the lower-scale level. And that will be done with raw material specs, it'll be done for blending, and then it'll be done on the quality assurance side for tablets in terms of active content, whatever you're registering.

You then have to build that in. You won't actually use that for product release until you can validate it. You can't validate it until your data sets are big enough. So I think you're forced down that line whether you like it or not.

DR. CHIU: Well, I think that's a good point, because with a compressed development time, you may not have enough of a data set. However, this applies to many other things as well. As with setting specifications, you know, we had a big workshop and we discussed that, you know, during the development time, you may not have enough data to establish the true mean and the 3 sigma; so, therefore, you won't have, you know, the right acceptance criteria. I think that will also apply to PAT with limited data, as Jeff mentioned this morning. Maybe there's some kind of change control, or a post-approval commitment, and then we can set something interim.

DR. LAYLOFF: I have to tell a little story. We were doing the prednisone in vivo/in vitro correlation, and we found this one product which failed the dissolution standard but which was bioavailable. But it was an illegal product because it failed the USP limit. We never took an action because we thought it would be very awkward to go to court and seize somebody's product we knew was bioavailable, just because of a technical violation. And the guy who made it said he wasn't going to reformulate, because we had demonstrated his product was good.

Going on to Question Number 5: What information should be included in the proposed guidance on product and process development and process analytical validation?

DR. HUSSAIN: The way we phrased that question, that becomes sort of a working group question--

DR. LAYLOFF: Okay.

DR. HUSSAIN: --this is a broader question. And what I was hoping is that you'll use some time here to define the charge for the two or three working groups and let them go at it.

DR. LAYLOFF: Okay, so we have defined the charge for instructional program pretty much?

DR. HUSSAIN: You already did.

DR. LAYLOFF: Yes.

DR. HUSSAIN: So, the two working groups: process development and--and validation.

DR. LAYLOFF: All right, what do we want them to look at? We have Judy and Art chairing those committees.

DR. HUSSAIN: For starters, I sort of posed questions this morning. On the two pages, you have those. That could be one. And on the back of my handout, I have a list of questions that we received from Merck. So that could be the set of questions. One approach could be here are the set of starting questions and the technical folks will get to the working group, use that and sort of start defining their charge themselves, that could be one approach.

DR. LAYLOFF: So we have the questions that you handed out and the questions that are cited in your presentation for guidance. Is there anything else we need to discuss before we break for the sessions?

MS. REEDY: All right, the break-out rooms will be supplied with the break food and drinks. So the ones in this room are for Room A, and Room A, this room, will be Process and Analytical Validation, chaired by Dr. Kibbe.

The next room, south of here, is Room D, and that'll be Product and Process Development, chaired by Judy Boehlert.

And the last room, at the end of the hall, Room E, will be Analytical Technology and Training and chaired by Ken Morris.

DR. LAYLOFF: If there are no other items then we will take a break now for 15 minutes and reconvene in those rooms and not reconvene here today, but reconvene here tomorrow morning.

DR. HUSSAIN: And the working group members in the audience can choose whatever group they need to go to and would like to go to, so--

DR. LAYLOFF: But the training group we wanted all the academics to go to the training room.

DR. HUSSAIN: Yeah, correct.

DR. LAYLOFF: All the academic people are banished to the training session, and the others may choose their own session, and we will reconvene here tomorrow morning at 8 o'clock. So you have a 15-minute break now; go to your session. And then, at the end of that session, you're free for today.

And then tomorrow morning, 8 o'clock here.

[Break.]

DR. KIBBE: It's my hope that we would all spontaneously want to get together and carry on this afternoon, that people would immediately want to stop doing whatever they're doing, which I'm sure is extremely valuable to get back to do what we think we need to get done. And so, rather than calling you to order, I'll welcome you back.

This room, we're going to validate process analytical tools. Would you like to validate?

MR. : Yes.

DR. KIBBE: Hi, guys. Look at them all hiding back there. All right. This is a subcommittee, and the purpose of it is, of course, to review some of the information we've done in the past, some of the thinking that we've had in our various meetings, and come up with some recommendations for acceptable guidelines for validation of the PAT processes that might be put into place.

And I see around the room experts in validation, I can tell by looking at them that they probably know so much more than I that they're just going to leap forward and give us the correct answers. I have a very simplistic way of establishing a valid analytical method. You do it, you show it to me, if I like it, it's valid. That's the old FDA method of approving anything. But we're going to try to be a little bit more scientific and actually come up with criteria.

So they didn't give me any speeches to make and I'm a university professor, I can talk for 50 minutes on no topic at all, but I can't talk for 3 minutes on anything worthwhile, so let's go with validation.

How would we recommend that the Agency set up its guidelines for accepting a PAT in place of, or in lieu of, or as a method of superseding a current method for approving a process or a drug product? We seem to have focused on oral solid dosage forms, although my good friend David, who has disappeared, talked about suspensions before, and I think we have to remember that we are talking about any kind of dosage form, but it seems pretty apparent that oral solid dosage forms get the most interest. Maybe because there's more of them and maybe because there are some opportunities there unmet before. Would anybody like to comment on how we validate? Somebody's going to comment, thank goodness.

DR. MILLER: I'll just start a discussion--just kind of start with a question here, is it a reasonable thing to take as a starting point, what's currently done for validating laboratory analytical methods and see what needs to be done to those in order to make them applicable to a process system?

DR. CIURCZAK: Two people--when we were at the Advisory Committee meeting, I think it was last month--made the point about sensors versus analysis. Rather than thinking within--and I hate this term because every commercial in the world uses it--instead of thinking inside the box, we're so used to being analytical chemists, where we have to come up with a number, 98.75730201, and round that down one, instead of just saying good, bad, or indifferent.

If we're going to set up a set of sensors throughout a process, we may not need to know an exact answer from any one of those--this is like the USP concept--I hate going back to that. But if you look at something like lactose, you boil it up with copper oxide and if it turns red, you've got a reducing sugar; you put it in ammonia and you look at the optical rotation. No one of these things is definitive, they're all circumstantial. But it all adds up to a quality or an ID for a product.

The same thing when I used to talk about near infrared as a final release and everybody was saying, well, it's a single test, you have trouble with things like specificity. And I'm saying, no, it isn't. The trail of evidence under FDA guidance is stricter than anything the FBI ever had. From the minute that we quarantine raw materials and start doing tests, and there's labels and they're quarantined and they're shipped with paperwork and signatures, even to the point of two people signing off the weighing of them into the blenders, and it's validated this and validated that--by the time you get to, say--and I use NIR because I make a living at it--using NIR for final release, it's the last in maybe 25 or 30 tests. You know what's in there. You have a pretty darn good idea from the batch record how much it was, that everything along the way was there. So, I think what I'm trying to say--and to answer Howard in a long way--is that any of these tests need not necessarily address every one of the ICH guideline criteria--specificity, linearity, et cetera, et cetera--we might be able to bring it down to a certain amount.

If we're looking at pH for a flowing system, you know, all we have to do is show that it's linear between 4 and 7 or whatever, because you can get carried away with all the rules and guidelines. As I said, God only gave us 10 SOPs and look at the size of the regulatory committee that we have now between the mullahs, and the priests and the rabbis.

You know, we don't want to overdo it; the KISS principle, I think, should apply here--keep it simple, stupid. Using a lot of inferences, I think, would be a good way to go here.

Weighing the tablet--if you've shown everything is perfect and the blend is perfect and you've got a validated tableting process, you should be able to weigh it. You know, something as simple as that. We would tend to think of fancy spectra and chemometrics, but how about weight, or hardness, or color or something like this.

That's all I wanted to put in and we don't need to, necessarily, in my opinion, go to extremes for every single one of these tests that we put on line. Just as long as it does what we say it does.

MR. COOLEY: Art, kind of building on, I think, what Howard was saying: you know, it's good to start at some point, and the ICH guidelines may be a place to start. But I think the issue, similar to what Emil's driving at, too, is that there are some applications where that makes sense and there are going to be some applications where that doesn't.

Looking through the minutes of the previous validation meeting, it appeared that there was an attempt to kind of pigeonhole PAT in one box and it was even referred to as inferential measurements, I think. I'd like to throw out an idea to, maybe, think about this in a little bit different light. And that is I don't think of PAT as necessarily an inferential measurement. It can be just as specific as any laboratory test, but it could be just as inferential as a pressure transmitter.

So if you could kind of look at it: on the extreme left, having inferential measurements, like pressure, temperature, flow, volume--things like that that we typically use to control our processes. And on the extreme right, laboratory methods where we do need all of the specificity and so on, because they're release assays. And think of PAT as kind of a bridge between those two, where toward the extreme right you have PAT methods that may be every bit as accurate and specific and precise as a laboratory method, and in that case, you could certainly use them in place of laboratory release methods and they would need to be validated to that level.

But on the far left, you may have things where an online analytical measurement may appear to be more in the realm of a pH or--I'm sorry, of a pressure transmitter, and you would certainly validate it in that way, if that's the way it's being used.

DR. KIBBE: I think you made a couple of good points and I want to make one other one. We keep talking PAT, but that, in my mind, is a group of technologies and they're not all the same. And our colleague over there is doing near infrared and, you know. I know how to do a blend, when I'm adding one pink ingredient. I wait until the color's uniform when I see it and I don't need any fancy equipment. I can look at it and it's all a uniform color. That's how I do my paint when I paint my walls and ceilings, right? Oh, yeah, we paint them.

But in any event, I think you're right. I think what we are faced with is--depending on the technology that we are using as an in-process tool to clear our batches or monitor our process, we have to have a different validation. And it could very well be that blend uniformity, as determined by near infrared or some other probe in our blend, can only truly be validated if we get to what we think is an end point of blend uniformity and that blend results in a truly uniform batch of tablets.

And would that be good enough for the Agency? It might be good enough for me, but would it be good enough for you and if that's the case, we can have real simple validations for some things and others more complex.
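A minimal sketch of one common way such a blend end point can be flagged, under stated assumptions, is below: watch the moving-block relative standard deviation of successive in-line readings (for example, an NIR-derived signal) and call the blend uniform when it stays below a threshold. The window size, threshold, and simulated signal are assumptions for illustration only.

```python
# Illustrative sketch: declare a blend end point when the moving-block
# relative standard deviation (%RSD) of successive in-line readings
# drops below a chosen limit. Window size and limit are assumptions.

import numpy as np

def blend_endpoint(signal, window=10, rsd_limit=1.0):
    """Return the first reading index at which %RSD over the trailing
    window falls below rsd_limit, or None if it never does."""
    signal = np.asarray(signal, dtype=float)
    for i in range(window, len(signal) + 1):
        block = signal[i - window:i]
        rsd = 100.0 * block.std(ddof=1) / block.mean()
        if rsd < rsd_limit:
            return i  # number of readings (revolutions) to uniformity
    return None

# Simulated readings that converge toward a uniform value over time.
rng = np.random.default_rng(0)
readings = [1.0 + 0.3 * np.exp(-k / 15.0) * rng.standard_normal()
            for k in range(120)]
print(blend_endpoint(readings))
```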

MR. COOLEY: To comment on that. You mentioned validation that's tied to a certain technology. I would propose that the validation be tied to its intended use and not the actual technology.

DR. MARK: Yeah, that's essentially what I was going to say. It seems to be at least as much the application, because it might be--in some cases, I was thinking, you might want to make a simple test for controlling the process. Say a company learned that controlling--you know, some parameter, I'm not even going to try to pick out a specific one--controlling some parameter controls the process adequately for their needs. But that wouldn't be enough to satisfy the regulatory requirements. When they got to the end, they'd still have to do a separate set of validation measurements for regulatory purposes, but the simple PAT test would be enough to keep the process in control.

On the other hand, they might have a whole suite of tests in the PAT, and that would simultaneously satisfy the regulatory requirements. So, you know, there's a whole range of possibilities of how it could be applied, as well as of possible technologies, and that would determine how much validation was needed.

DR. KIBBE: Let me turf a little bit of what you said to some of the people at the Agency. Isn't our intent here to develop ways of replacing the standard testing with process-testing tools and if, in fact, that tool is predictive of the outcome, isn't that the direction we want to end up going?

MR. FAMULARE: Yes.

DR. KIBBE: I love, true/false, questions. You have to push the button so we can hear you.

DR. WOLD: So, I think that we have to sort out two things. One is what PAT is used for and the other one is process control. Because if we start to mix in process control and if that has to be validated, too, I think that FDA's role will expand greatly. That was not your meaning. So it is very dangerous to have this control within the purpose of PAT. PAT's purpose as I understood it is precisely what you said to be used instead of other traditional chemical testing.

Traditional chemical testing is not used for process control. PAT can, if you want, be used as process control, too, but that is not--

MR. FAMULARE: That's already an expectation for process control even under the current paradigm because you wouldn't be able to achieve validation without control.

DR. WOLD: Yeah, but--

MR. FAMULARE: The difference between today's paradigm and the hoped-for paradigm with PAT is that you'll have more data. We'd hope that with more data you'd be able to better control the processes for a more positive outcome, as opposed to, I think, the way of thinking as you expressed it, that FDA is looking to exercise more control.

We're looking for you to exercise more control--

DR. WOLD: Yes, for sure.

MR. FAMULARE: --so that we could step back from these actually indirect ways of looking at things from just limited data sets.

DR. WOLD: But that's the way you use the data from process control--under process control, does it keep the process at the right temperature, right speed, whatever; that is to say, together with all the other process variables, are they in a certain range?

So, for sure, if you want, you can use control data for PAT also, to ensure that your product is okay. But I think that it's very unfortunate and very confusing if we start to mix them. Because, let me put a direct question to you: if somebody comes in and says, I have a better thermocouple to control the temperature in the inlet air of a drier, does FDA have anything to do with that? Or do you say, no, this measures the temperature and this is fine? I don't think FDA meddles with how people control the process from a technical engineering point of view, do you?

MR. FAMULARE: That is, that can be a GMP issue--in terms of a root cause as to why, you know, a process does or does not work.

DR. WOLD: Yeah, sure.

MR. FAMULARE: Whether it be a thermocouple in a heater for a drier, or in sterile processing, where it's critical in terms of monitoring autoclave temperatures, et cetera--

DR. WOLD: Yes, but--

MR. FAMULARE: --so I don't know, I'm not quite clear how you are segregating the, you know, qualified equipment is important so--

DR. WOLD: --my question is, do you--

MR. FAMULARE: --it has nothing to do with PAT, it's just--

DR. WOLD: --yeah, but the problem is, I see, we are discussing two things. We are discussing PAT to substitute for testing, as you said, and that's one straightforward application: we can eliminate a lot of traditional testing and put PAT there instead, because it measures basically the same things, but in a better way and perhaps indirectly--we have lots of signals, but it's basically the same chemistry we're looking at.

But then comes the second thing, which is, of course, that once we start to do that, we can then use it also to detect upsets or out-of-specification situations, or whatever you call them. And then we may have to do something. And that is process control.

And then, if you have an operator doing things, you call that open loop. If you take the PAT equipment and actually wire it so that it will, itself, correct the process, then you have to do a lot of identification and process control modeling and so forth before you can do that, but you can do that, too. But I think that's far beyond what we are discussing, because it becomes much, much more complicated and it was not the original intention.

I can see that this discussion is getting out of hand, so let me back off and say that, if we now go back to what I consider the traditional or accepted objective: for PAT to be used in that way, you have to have the same requirements for it as for any other testing.

The problem with PAT is that, because you have many more signals, usually, it's more difficult to keep track of all the things that happen, so you have to have a more elaborate strategy to find--to change the conditions of the process: too high a concentration of active ingredient and too low, too much excipient and too little excipient, too much blending and too little blending. All of these things together, and I think one should follow a design; otherwise, you can never do validation. So, that was what I was trying to say.

MR. FAMULARE: Well, I think, maybe, that's--part of what you're saying towards the end there is probably an issue for the training group in terms of you're going to be looking at a different data set as FDA, you're going to be looking at a different data set as manufacturers and we have to learn how to deal with that rationally, reasonably, and scientifically. And I would agree with that.

But in terms of, you know, the stated purpose as Arthur has expressed it, yes, you know, it can eliminate the need for conventional testing. You have out-of-spec situations now with the current paradigm. Our hope, from a positive aspect, is that this will, number one, prevent all those out-of-spec or recall or other manufacturing situations that limited data can address; and, secondly, if it is legitimately out of spec, be able to pinpoint the problem better, as opposed to having it, you know, indeterminate, with no other alternative than to dispose of the whole batch. So we're trying to look at it from those positive aspects.

DR. KIBBE: Let me get us back a little bit on--I assume you're going to go back to validation--

DR. MORRIS [?]: I think we're mixing--we're getting confused because we're trying to look at too many things all at once. We've really got four things we need to look at here. I think we need to look at whether this technology is controlling the process, number one, or whether it's monitoring the process, number two. And those are--while they've got many similarities, they have some very important differences.

Number three, is it a direct measurement, or, number four, is it an indirect measurement? An example of a direct measurement would be, let's say, an NIR analysis of an active ingredient in a tablet. An indirect measurement might be something like a hardness or something related to dissolution; it might be a temperature measurement, it might be a blender rotation speed, it might be all kinds of things.

So I think we have to keep these things, at least for a while, until we can clarify our thinking in separate boxes.

MR. FAMULARE: Right, and when we talked, about, I'm sorry--

MR. LEIPER: Thank you. I've listened for a while now and, with all due respect, I think that we're probably looking down the telescope from the wrong end because if the answer lay in what we did today, we would sure as hell know how to do it and we don't.

And the thing that's lacking--and it's been going around this room all day today, and it went round the room in the Holiday Inn for two days four months ago--is that we've got to understand the need. And the need is driven by our processes. We've got to understand our processes, so we can't accurately talk about validation of process analytical technology until we get it into our minds that we don't actually know how these processes work.

And one of the problems that we've got and had over the past is that we actually use univariate measurements inferentially to describe multivariate dynamic systems. Now, if we're going to get anywhere with this, we've got to understand that multivariate nature.

Point number one, about validation: when you validate a technology that's capable of a multivariate assessment and you use an inferential univariate measurement, you just might have an awful lot of trouble on your plate. And the blend uniformity working group is a very good example of that. You know, we--it's taken us two years to find out that we're really no further forward than we were two years ago because we were looking for a quick fix rather than something that actually took us far, far closer to where we want to be.

So the first thing is that we've got to understand the processes. Now, that's not just unique to manufacturing processes. We've got to understand our analytical processes. Now, if you think about our understanding of analytical processes and go back to the blend uniformity working group, there's one thing for sure: with equipment qualification, we know that the equipment's okay; we know with 21 C.F.R. Part 11 that the data management's okay. But if we do a risk analysis of an analytical measurement and take it from the sample preparation, the measurement, the data acquisition and reduction, and the production of the result, and we look down the right-hand column and say, where's our maximum risk? The maximum risk is sampling the process. So it doesn't matter how much effort we put into equipment qualification and 21 C.F.R. Part 11; if we don't get the first bit right, we've actually got big, big trouble.

So, you know, we can't just launch into this about pH measurements and all that kind of thing. We've actually got to understand these processes and take a step forward and say what types of measurements are going to allow us to facilitate that.

Now, wouldn't it be good if these measurements, these multidimensional measurements not only facilitated process understanding and development but, also, facilitated control and manufacture?

Because, if you look in your backgrounder and you go to Ray Sasher's [ph] presentation--and this was done on behalf of CAMP--you will find that the industry has got low utilization of manufacturing processes, 30 to 40 percent on average. And that's probably on a good day. And we get, on the next page, that a 1 percent yield improvement--now, bearing in mind we've only got 30 to 40 percent efficiency in the utilization--a 1 percent yield improvement would yield, probably very conservatively, $400 million in savings across 16 companies per annum. You know, this is what we're gunning for, and the beneficiary is the public. So, it's got to be process understanding. It's got to be the right methodology, I believe, and the same principles for looking at validation, or the structure of validation, in processes--it doesn't matter whether it's the manufacturing process or an analytical process, it's exactly the same. It's understanding the risks, it's managing these risks, and, having done all that, the validation is actually proving that you've managed the risks in the way that you have described them in that process.

So, I think we've got to get something far more fundamental than we've been looking at in the past or, indeed, today.

DR. KIBBE: Okay. So, you're going to have to help me, okay? So, I'm all excited, I can't wait.

MR. LEIPER: You're all excited, Art.

DR. KIBBE: I can't wait--I cannot wait.

MR. LEIPER: Watch your pacemaker.

DR. KIBBE: Right, my little pacemaker's going, you know, pitty-pat here. We, I think, intuitively all understand that whenever we make something, there's a process, and if we want to make exactly the same thing each time, we follow exactly the same steps and we should come up with the same result. And if we don't, then we might not end up with the same result. And so, if we can find a way of keeping track of all of our steps, at least the critical ones, then the outcome will be fine and I don't have to do terminal testing, right?

So, now we're looking at process analytical tools or assessment tools to be able to help us do that, and what we want to know is what kind of a guideline can the Agency develop that will help industry feel comfortable that what they do to validate any tool is going to help them know that the tool is working well?

MR. LEIPER [?]: I think it's quite straightforward. It's the same as in any other industry. You actually--you understand your processes, you identify the critical areas, you categorize the risks, and you manage these risks. And some of them you manage with a PAT. I mean, some of them--this morning, when we were talking about an SOP to get ingredients into a blender in the right order: a bar code reader's only $300 or something like that, and we can actually make sure it goes in in the right order. We don't have to have bits of paper that we sign to say that these kinds of things happen. There are very interesting technologies that are used in your supermarket that will actually do that for you. You know, we've just got to think differently; we've got to think out of the box.
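A minimal sketch of the kind of bar-code check mentioned here is below, purely for illustration: scanned material codes are compared against the charge order called for in the batch record. The material codes and the expected order are hypothetical.

```python
# Illustrative sketch: verify that ingredients are charged to the blender
# in the order the batch record calls for, using scanned bar codes.
# The material codes and batch-record order below are made up.

EXPECTED_ORDER = ["API-001", "LACTOSE-200", "MG-STEARATE-05"]

def check_charge_order(scanned_codes):
    """Return (ok, message) comparing scans against the batch record."""
    for step, expected in enumerate(EXPECTED_ORDER, start=1):
        if step > len(scanned_codes):
            return False, f"Step {step}: expected {expected}, nothing scanned"
        scanned = scanned_codes[step - 1]
        if scanned != expected:
            return False, f"Step {step}: expected {expected}, scanned {scanned}"
    return True, "All ingredients charged in the correct order"

print(check_charge_order(["API-001", "LACTOSE-200", "MG-STEARATE-05"]))
print(check_charge_order(["LACTOSE-200", "API-001"]))
```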

MR. HALE: I think that--I agree with all of that and it gets back to a design issue of thinking about not using sensors, but thinking about designing what you're doing and a lot of it falls out.

Another thing that hasn't been talked about a lot and that is an issue here is specifications. Because we define specifications very early and we can, therefore, tie our hands, based on the way specifications are written and the methodologies that go into specifications, so that the freedom to optimize, or to measure to improve, or however that's defined, is constrained before a lot of other things happen, like scale-up and manufacturing and so on. So I think that one effort that could help us define how to do the specifics of validation could be looked at as a function of how we write our specifications or, in another way, how we do the release of either the product or the unit operation.

And I think it could be defined somewhat along the lines that was earlier talked about into three different categories.

One would be the traditional way that we do this, where we take samples after a process is done and the process is defined within parameters that are static, whether strict or not. And the testing of either the unit operation or the product is done in a physical-chemical sense in a laboratory, away from the process.

The other one would be a process that is controlled and the product quality is inferred from the data on the process.

And the third way is that the product itself is actually measured, and the process is controlled to assure product quality. And if you look at blending as a unit operation, in those examples, you can take samples based on rotating a blender a fixed amount of time--based on development data, one would presume--and take a sample and test it off line. You could measure the process a number of times, or you could actually have a probe that measures the uniformity somehow in there, and the validation would be defined differently for each one of those cases.

DR. KIBBE: It's my impression that often when the industry looks to the Agency for a guideline, they want us to tell them that you take this number of samples now, and you do this and you do that, and that's validation. And I think what we need to tell them is the general rules and let them establish it, and I wonder how many of the people who are industry people out there are comfortable with that--knowing that the way they interpret the rules is then going to be further interpreted by Agency people?

MR. LEIPER: You know, I think it's quite clear that there have been claims over all these meetings that we ought to be able to scientifically justify what we do. And I think that it's incumbent on the Agency that it's actually got to scientifically review the information that's provided. And, you know, I think that these are pretty big burdens that we're going to place on all sorts of people, but traditionally, when the industry asks for guidance, what it has been looking for is an instruction. And the instruction is that if we do this and the FDA come in and look at it, then it'll be okay. And it doesn't matter what the hell happens to the processes, because we can live with 40 percent efficiencies. I mean, that's the indication.

You know, so it's breaking--it's actually breaking that mold. And I think that a lot of that was done when we went to equipment qualification. It's fascinating: we wrote GMP, and we got into that in the '60s and '70s. We wrote guidance on validation in the '80s. And in the early '90s--'91, I think it was--we wrote equipment qualification. And then in '93, we came up with something on out-of-specification results. Now, you know, logistically, it's all in the wrong order.

Equipment qualification, however bad it was, had to happen first. You know, because you can't do anything unless you know that the equipment is actually working in some sort of way. And then you can begin to write approaches to GMP, and then you might be able to write something about validation. But over all that period of time you were dealing with out-of-specification results--not too well, I may add--but we were dealing with it.

And this is an opportunity to put these things into perspective. And I think that the model that you've got for equipment qualification is actually a good model to follow because it starts with design qualification. If you don't know what you're trying to do then you'll never make it.

You then go to installation; you go to operational and performance qualification, and performance qualification, to all intents and purposes, is iterative validation--revalidation. If you've got that right, that's what happens.

The thing about that whole system is that it's always referred to as the 4-Qs approach to validation, but it's not. It's really the 5-Qs approach to validation. And the fifth Q stands for rescue and that's what happens when the DQ has been done badly. And it's all--all this is front-end loaded.

DR. MARK: Okay, I'm not going to say Ken is wrong because he's right--but--

MR. LEIPER: Was that a validation statement?

DR. MARK: What?

MR. LEIPER: Was that a validation statement?

DR. MARK: I think so, I'm validating what he said. But the problem, as I see it, is that what Ken's talking about is a very long-term thing--I mean, years and years of research, you know, to do enough work on a process to understand it thoroughly--

MR. LEIPER: And the confusion that we've got, Howard, is that we've got years and years of mumbo jumbo. And if we could get the mumbo jumbo out of the way, it wouldn't take years and years and years of research.

DR. MARK: Now, that may be, I don't know. For better or worse, I've never worked in the pharmaceutical industry directly, so I couldn't speak to it. But it sounds like you're talking about doing the whole process development, which is certainly something that's necessary, but I think not what this group is supposed to deal with. I mean, we're talking about process analysis, which to my mind, you know, does mean a number, even though, of course, I understand there are important things like blend uniformity, which aren't, you know, a concentration per se that you want to measure.

DR. KIBBE: If you don't know what the process is, how are you going to measure it? And how are you going to track it? And if we're going to do process assessment tools, I like my word better than analytical, then--then we have to know what process we're assessing. We do that in education all the time. We think we're educating our students and we assess how well they've been educated and we find out we can't do anything with them.

But what I'd like to do is get some other people to comment. Jerry, you have something, you want to jump in here?

DR. WORKMAN: Yeah, I've been a little bit confused about the overall issue of validation, because when I look at what you're looking at, you're looking at sensor and software validation; you're looking at sensor calibration and validation, which with multivariate problems is a lot different problem than with univariate ones. Then there's process monitoring validation: if you're going to monitor, what are the protocols and how is that validated? If you're going to model the process using that information, how are you going to proceed with that to get a good model? And then, also, the controls: if you're doing process control, what are those protocols and how are those validated?

Is the method a primary method or a secondary method? If it's a secondary method, you need a primary method, so you have to validate that before you do the secondary method.

Are you looking at a direct analyte [ph]--an active, for example--or an indirect analyte, like dissolution, or are you looking at a virtual analyte, like how much the customers love this when they take it? Those things are possible, as well.

So in all of these arenas, if you will, there are specific validation issues that have to be addressed. And they're somewhat separate in how you would address them. I know, for example, if you're looking at multivariate calibration, it took a group in ASTM--anywhere from 40 to 100 people--8 years to put together a protocol on how to do multivariate calibration for infrared and near infrared in a continuous process: how to do the outlier detection, how to do the monitoring, how to tie that in to closed-loop control, and to get that many people who were doing that type of work to agree on how to do it.

So, there are a lot of specific issues, and I'm not sure which one is being addressed. If anyone can help me.
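
As one concrete illustration of the calibration/validation issues Dr. Workman raises, a sketch of a simple multivariate calibration checked against held-out samples might look like this; the data are simulated, and the model is ordinary least squares rather than the full ASTM treatment he refers to.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical data: absorbance at three wavelengths vs. a reference assay value.
    n_cal, n_val = 30, 10
    true_coef = np.array([1.8, -0.6, 0.9])
    X_cal = rng.normal(size=(n_cal, 3))
    y_cal = X_cal @ true_coef + rng.normal(scale=0.05, size=n_cal)
    X_val = rng.normal(size=(n_val, 3))
    y_val = X_val @ true_coef + rng.normal(scale=0.05, size=n_val)

    # Build the calibration on the calibration set only (ordinary least squares here;
    # a PLS or PCR model would be validated the same way at this level of sketch).
    coef, *_ = np.linalg.lstsq(X_cal, y_cal, rcond=None)

    def rmse(y_true, y_pred):
        return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

    rmsec = rmse(y_cal, X_cal @ coef)   # error of calibration
    rmsep = rmse(y_val, X_val @ coef)   # error of prediction on held-out samples

    print(f"RMSEC = {rmsec:.3f}, RMSEP = {rmsep:.3f}")
    # A validation protocol would compare RMSEP (and bias) against a pre-set
    # acceptance criterion tied to the variability of the reference method.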

MR. LEIPER: I understand exactly where you're coming from. The point about it is that if you start off at a low level, you'll forget what you were actually trying to achieve. The most important thing is to keep in mind what you're trying to achieve and you can mark down that and you can refine it as you go along, Jerry, I think that's important.

And I think the other thing that's important is that the methodology--the assessment methodology--is inextricably linked to the process that you're looking at, you know. And that's somewhere that we've never actually been before. Because analysis has always been carried out in isolation from the process, and processes have been designed in isolation from the analysis. And I think this is where, you know, the points that Tom's been making all day and at the last meeting come in--it's important that we actually design--that we actually think about these processes.

It's also important, when we begin to look at this, that we actually design processes that are measurable. We don't set ourselves Mission Impossible, where someone designs a process and no one's got a cat's chance in hell of coming up with a measurement system for it.

You know, there's an awful lot of things that have got to go into this, but I think we come down to the issues that you describe. I mean, for instance, blend uniformity. We know we can do blend uniformity by an end-point-type methodology; that would be a methodology that we would use. The problem that we've got is that the whole sampling regime for blends is discredited, because we know we can't sample them. You know, so how will we use that? Is that the reference methodology for the validation?

It actually looks at the distribution of the active, and it assumes--it assumes that the excipients are, indeed, the most important things. Art was talking about this this morning--that the mag stearate is distributed because the active's distributed. Rubbish. We know that that is not so. So we've got to put our existing methodology, our existing approach to these correlations, under as many challenges as are justifiable, just like the new methodology that we're putting in, because the problem that you've got with a new method, when you cross-validate it against an old method, is that you could actually be detuning the new method to meet the conformance of a method that you know is not doing you any good.

DR. KIBBE: What we've agreed, I think, is that we can't always use existing methodology to validate what we want to put in place; that we have to have validation protocols written for a method and a process by the company that's using the method and process. And then we have to have some criteria that the Agency can use to say they've written a good validation in their situation.

And then Jerry's list, which I thought was quite complete, is the guideline list for the Agency to say: okay, these are the questions that need to be addressed in any validation. How many of them can be ignored in this process because they don't apply? And how many of them should have been looked at because they do apply? And did the company look at them? Am I getting close to where we are? What do you think, Tom?

MR. HALE: I think that's right. I was just sitting here thinking that we have a regulatory model--and I'm not sure this makes sense at all, but I'll say it anyway--a model that we use for filings in the developmental pharmaceutics section of how we got to an endpoint in terms of the product. And I know I've done this before, but the history of development of these processes might be a way of getting to a validation. There could be a disjointed redevelopment process at each scale, or there could be an inherently scalable process and product. And that might be an important aspect of what's required to proceed further in validation.

MR. CHIBWE: Yeah, I think that's probably the best way to proceed. Because, if you go back, we seem to be going into Phase III, when I believe that PAT is in Phase II. So, when we're jumping to process validation, we're actually trying to go into Phase III for continuous production. If we have the safe harbor, and if it's going to be as protected as we say it's going to be, then the development work itself should provide the validation that is needed.

In other words, it's going to have the traditional limits--specificity, ruggedness, linearity--because those need to be specified. Because you simply can't measure something and come up with some statistical analysis and just claim this is what I have. You will definitely have some reference to a traditional method during your development. And that's when the validation's going to take place.

And if you're going to take everything back into Phase II, I think that's where our discussion should focus for now. And later on, when we have developed it to a point where we're going to go into continuous production, I think that's when we'll probably encompass the entire process validation.

Because, otherwise, at this point, I think most companies would try to use certain sensors for certain parts of their process.

MR. CHISHOLM: I finally managed to steal my mike from Ken for a minute, you know. I think we have to get a little bit careful. We're getting a bit esoteric at times here, I think. And I think if it goes too esoteric, it can become meaningless.

We have two different scenarios to deal with. We have products which will probably be in late development, we have products which are out there already, and we have products which we're developing. And I think what you were talking about, going right back as far as Phase II, is where we have every opportunity in the world to design quality into the actual product and, therefore, its manufacturing process. So that has a different set of validation criteria, I think, from those currently in late-stage development--where I would suggest maybe a lot of companies will be wanting to submit these--and products that we already have that are fairly young and would be worthwhile submitting. It's very unlikely we'll submit old products anyway.

So I think there have got to be different validation criteria for that. Now, the only way that I can see us actually dealing with products in late-stage development and products already in manufacture is by demonstrating equivalence to existing registered methods. I cannot see any other way, because you have not had the chance to get the design process right that everybody keeps talking about. So I think there's two classes of problem here when it comes to validation, and I think we need to deal with them both separately.

DR. WORKMAN: Yeah, there have been a lot of discussions, over many different organizations and groups, about how to describe the whole calibration/validation process--whether you want to specify exact details in a cookbook fashion, or whether you want to treat the method as a black box, where you thoroughly describe the design of the experiment that goes into the black box, and then thoroughly describe how you validate whether or not what you did in that black box is working.

And, of course, you would document everything that was done there. But most of these complex multivariate methods, in my opinion, can be addressed by the input and output issue so that you don't have to completely describe every mathematical process that goes on.

Once the method results are obtained and that information is provided, then what you do with that information is the same thing that you would do with standard analytical information if you had it in a real-time basis. That's one way to address it--one model.
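
A minimal sketch of the input/output style of check Dr. Workman describes, and of the equivalence to a registered method that Mr. Chisholm mentioned earlier, might look like this; the paired results and the acceptance limit are hypothetical.

    import numpy as np

    # Hypothetical paired results on the same samples: the registered (reference)
    # method vs. the candidate PAT method, in percent of label claim.
    reference = np.array([99.8, 100.4, 99.1, 100.9, 100.2, 99.5, 100.7, 99.9])
    candidate = np.array([99.6, 100.6, 99.3, 100.6, 100.4, 99.2, 100.9, 100.1])

    diff = candidate - reference
    bias = diff.mean()
    sd = diff.std(ddof=1)

    # Pre-specified, hypothetical acceptance criterion: bias plus two standard
    # deviations of the paired differences must stay within +/- 1.0 % of label claim.
    ACCEPTANCE_LIMIT = 1.0
    worst_case = abs(bias) + 2 * sd

    print(f"bias = {bias:+.2f}%, SD of differences = {sd:.2f}%, worst case = {worst_case:.2f}%")
    print("Equivalent within the stated limit." if worst_case <= ACCEPTANCE_LIMIT
          else "Not demonstrated equivalent; investigate.")

Only the candidate method's outputs are judged here; what happens inside the method is documented separately, in the black-box spirit described above.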

DR. TIMMERMANS: I just wanted to make a couple of points. I think, in most cases, we will have an opportunity, if we have--if we implement a process analytical technology-based measurement to go back and compare it to an existing analytical methodology. In some cases, though, I foresee that we may not. And we may actually make an inferential call based on a result that we obtain on a product further down the line. So I think that that's something that one should, you know, should keep in mind.

Also, while I agree with Ken, you know, that ultimately a fundamental understanding of our processes is key, I agree with Howard's assessment that that's, you know, something that will probably take a while to get to because, in some cases, we actually, you know, we just lack the fundamental understanding of, for example, solids flow, to be able to really understand the blending process. So I think that that should be noted.

My approach--my personal approach, and I think I hear the same thing from a lot of people here--has been, you know, to use scientific rationale when you validate your methods. And, you know, to go back to one of Rick's points that he made very early on in this whole discussion, the applicability of the methodology can range from something very simple to something very complex. In addition, we're essentially trying to address a multidimensional space, if you will, with this validation discussion, and I think there are many components, most of which Jerry brought up, each of which has its own issues that may need to be addressed. But I think the scientific rationale should be at the fundamental--at the basis of the whole discussion, so--

DR. ANDERSON: Just to amplify your comments. Right now--and I know you have had experience with this as well--if we do good science, we can bring that in, we can submit the method, and we can be doing PAT tomorrow. In fact, that's literally my plan. But it's my understanding, from all of us sitting here, that we want to make it easier for companies that aren't willing to step out in front and say, I'm going to do this because I've done the science, and I'm going to hope that there's reason in the FDA and things go well.

What we need to have is a tool for us--for me, as an industry person and for you all as people who are evaluating my science, a way for us to connect and for you to easily judge, or at least a framework to judge my science with your investigators. What do we list and what do we put in that framework--what does that framework look like?

DR. KIBBE: And I think that's what I was trying to get at a little while ago, when I said we had to take some of your list of things and then let the scientist who's ready to move say: a bunch of these items don't apply to this particular process; these items apply to this process; I've done these things, and I've ruled out the possibility that my result is a function of some variable that isn't under control, that isn't part of the process--I've ruled those out because I've looked at them--and now my process is under control, this is telling me this, and this is what I'm going to follow. And I think we have colleagues who would rather have us say, measure these six things, or measure these two things. And I don't think we're going to get there, and I don't know if anybody thinks we're going to get there.

We have opportunities for guidelines that apply to everything, and we have opportunities for multiple guidelines that apply to different kinds of things. And should the Agency be in the business of one overarching guideline for validation of PAT, or should it be writing 20 or 30 guidelines--one for how to handle active ingredient arrival, one for how to handle blending, and so on? I think that's another way of looking at it. I don't think the Agency wants to write 27 guidelines, but they also don't want to be in the business of arguing a guideline with a person who thought he lived up to it when they didn't, either. I mean, that's one of the problems. And you have a good one--go.

MR. LEIPER: Well, you know, I think that there's been some interesting stuff that has happened around what you're referring to anyway, Art, and that is that I don't think the industry wants a compendial approach to this at all, because all our processes are actually different processes and they're processes in their own right. They've got commonalities, but they are different processes. And it is interesting to see the approach that the FDA took at the USP meeting on functionality in December, where they said they recognize that the functionality of materials in solid-dosage forms is fundamentally important, but it is process-specific; it's not a compendial issue. Which puts the onus back on the people who are responsible for the processes, i.e., the industry, to actually, scientifically, investigate and defend the stance that they're taking. And I think that the good thing about these meetings that we're having is that they're bringing the industry and the regulators together, because the industry's got the processes and the regulators don't. And it's establishing these linkages in a non-threatening environment, may I say, that's actually important and will take us forward.

DR. WOLD: Now, I think that those who say that PAT can be validated in the same way as any other equipment are, in principle, correct. But let us go back now and see what is specific about validating analytical technology. I mean, the first thing is that any analytical technology is put into the process, or after the process, to measure something or to deal with a certain problem. If you say, I want to make sure that I don't have too much or too little of the active ingredient, then you develop an analytical procedure for that, and then you validate it by, first of all, showing that if I have too much or if I have too little, it really shows that I have.

Then you have the second problem, that each analytical method is reacting to other things--to disturbances and interactions and so forth. Now, you have to make sure that the normal disturbances you have--in my process I have excipients that vary a little, and I have temperature, and I have humidity--that these don't disturb my measurements too much. So you have to vary these and show that your measurement behaves okay.

Now, the real problem comes after that, I think. And that is that in the real process there will be a number of new disturbances that we haven't thought about--indeed, that we haven't understood. And process analytical technologies based on spectroscopy or any other multidimensional sensors are more sensitive to this whole world of new disturbances, which is a very good thing, because we see them. But that is also problematic, because we don't know how to deal with this new information, and I think this is what we are, so to say, having great difficulties with.

The first two we can evaluate as any other univariate or few-variate method; we can deal with them in a very straightforward way. But to say, optimistically, that process analytical technology will now solve all future problems--then we have to incorporate into the validation, in some way, all future problems as well, and that is a very great difficulty. And we have to go piece-wise. And I don't know if FDA is willing to go piece-wise and say, now we have this operating as well as traditional methodology, and in five years we shall see from real production how well it actually caught unknown disturbances that we haven't seen.
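
A sketch of the disturbance-variation exercise Dr. Wold describes, running the same measurement while temperature, humidity and excipient lot are deliberately varied, might look like this; the factors, levels and the simulated measurement are hypothetical stand-ins for real experiments.

    import itertools
    import numpy as np

    # Hypothetical two-level factors for a robustness study of a PAT measurement.
    factors = {
        "temperature_C": (20.0, 30.0),
        "relative_humidity_pct": (30.0, 60.0),
        "excipient_lot": (0, 1),              # coded: two different vendor lots
    }

    def measure(temperature_C, relative_humidity_pct, excipient_lot):
        """Stand-in for running the PAT method on the same material under the
        given conditions; in practice this would be a real measurement."""
        rng = np.random.default_rng(int(temperature_C * 10 + relative_humidity_pct + excipient_lot))
        return 100.0 + 0.02 * (relative_humidity_pct - 45.0) + rng.normal(scale=0.05)

    results = []
    for levels in itertools.product(*factors.values()):
        settings = dict(zip(factors, levels))
        results.append((settings, measure(**settings)))

    values = np.array([value for _, value in results])
    span = values.max() - values.min()
    print(f"Range of results across {len(results)} conditions: {span:.2f}% of label claim")
    # The protocol would compare this span (or per-factor effects) against a
    # pre-set robustness limit before accepting the method.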

DR. KIBBE: Let me just see if I've gotten some of what you said and put it in my own parlance. If we put in a new tool, which is naturally more sensitive than the old tool, then it will find variation that wasn't there before, just because its sensitivity is up. We perturb the system to make sure it actually can notice changes that we make in it, so that we know it actually is going to measure changes and not ignore them. And then we decide at what point we're happy with the variation it sees as being within limits. In other words, we set our limits of its variation to match up with what we've already got. All right?

Then we stop doing the second thing or the original test and we now depend on this new system, but we don't know, five years from now, whether it will miss a change that the old system wouldn't have missed. Is that part of our concern?

DR. WOLD: No, it will see all the things that the old system saw but we know that the old system we have today, is not adequate. Any system we put in is inadequate for everything that happens in the future. So, we want to simulate in some way the real variation in the production, including what we don't understand and this goes back, now, to Ken, who says we don't understand our process. We will never understand our process fully. That's impossible, because it's more complex than our brain.

MR. FAMULARE: It almost sounds, though, you know, under the current paradigm you do the process development work, you have your standard analytical tests, you feel comfortable with the process, you represent this as your specifications and you validate against them and the process goes along for five years and you find something, you deal with it. And, you know, you may have to investigate what caused that change. Now, the way you're describing it, you'll--and I may be getting it wrong--you'll put in a PAT process, it's more sensitive but, hopefully, we've factored in the sensitivity against the specifications so that they're statistically and scientifically rational. But then it sounds like you still want a five-year, 50,000-mile guarantee on it and I guess it would be a similar parallel to, you know, any unknown that might come up in the existing paradigms--in excipient changes or something happens. I don't know how we could satisfy that concern that you're raising in that format and how PAT is making you any worse for the wear.

DR. WOLD: If I may clarify a little. I mean, if we take, say, near infrared spectroscopy, we know that we cannot see the differences between different vendors of excipients. Now, we are not quite sure if it really matters, if this difference matters. But we suspect that it may matter sometimes, and with multivariate sensor techniques we can see much more. That means that today we know, from a scientific point of view, that the old way of writing specifications--where we just say we need content uniformity and we need this and we need that--is actually inadequate. And we start to see that already, and we start to have a lot of process problems; the list from your directors was very revealing in that way.

So the process analytical technology brings hope we can see more, we can be more realistic. But the question is, we can validate and say we do the same lousy job as our present measurements do, but that is not really what we want. We want to do better. And the question is, how do we validate that when we are not quite sure what better is? But maybe I'm too academic, I don't know.

DR. WORKMAN: Well, at the risk of over--I'd like to get back to that, but at the risk of over-simplifying, I think the validation procedure should include a rationalization for what information's needed, where it needs to be measured, when it needs to be measured, and how the information is used. Because if you put enough sensors on your process, it's sort of like, I think, raising teenage kids: you don't want to know everything they're doing, otherwise you'd probably be changing their lives an awful lot more than you should. If you know everything about the process, how do you deal with all this information? And then who interprets it, and do you throw out the bad stuff and keep the good things to make it look good? I mean, there need to be protocols, I think, in all those areas, but a good scientific rationalization for each process.

DR. CIURCZAK: One of the things I was thinking of is we seem to be either/or. One of the reasons you might want to slap a dozen or two sensors on a system is, literally, for information purposes. And if you watch it over the course of a year and you notice that when the moisture goes up, you have a higher reject rate or you have tablets that don't dissolve, you now know you can control the moisture. Now you can take that measurement tool and use it as a control tool.

The same thing with anything else. If the hardness doesn't seem to matter--if you go from 2 to 20 and your release rate's the same and everything else--you can can hardness. I think that, again, we can't a priori know what is an important factor. Ken is one of the few people that I found in the room when I came into pharmaceutical NIR, many years ago--I thought I was alone, then I heard this fellow--but you were wonderful in "Jurassic Park," by the way.

And, as Ken says--and he probably predates virtually everybody in this room in terms of looking at something like near infrared and pharmaceuticals--we don't know. We measure, we hope, we guess, we do a Karl Fischer and hope that the chemicals don't react with anything in there, and we assume that we're doing a lot of things. But if we use the PAT as a monitoring tool to begin with and then start filtering it--and, right, we are sensitive, we may see things we haven't seen before. There may have been changes we couldn't detect before. And we'll see this and say, hey, it's subtle, but when this changes our product goes good, bad, or indifferent.

So before we worry about validating them as a control, let's see if we can get the information--because there's a difference between data and information. I once went to a place, and I noticed that they were doing the room temperature and relative humidity in every room and they had two people in the company doing nothing but changing these things. And I said, what do you do with this? And, basically, they stored it. They never changed anything due to it. They never tried to get dehumidifiers--I said, that's a waste of time, I said, it's numbers, it doesn't mean anything.

We may find out we are doing almost as much as we need to do right now--a moisture on a granulation and a content uniformity--you know, we automate those things and we might wind up with excellent procedures. We won't know until we actually try some measurements along the way, and, again, up front you don't know what's necessary and what's sufficient. As I used to tell the kids, you know, copy down the numbers from the bottles, everything that's on the label. If it turns out it's not important, you just filled up some pages. If it was important and you don't have it, we'll never know where we went wrong.

But from this, we can then start design of experiments. If we can hold everything within range and vary one thing at a time, now we can do a very controlled scientific experiment and understand what's important. And we may wind up throwing a lot out and saying these have absolutely no effect, these are the three things we need to monitor, and we have process control. We can't go up front and say let's just take everything in at once and validate it.
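
A sketch of the monitor-first, control-later idea Dr. Ciurczak describes, in which a logged variable such as granulation moisture is correlated with an outcome such as the reject rate before it is promoted to a control point, might look like this; the batch data and the proposed limit are hypothetical.

    import numpy as np

    # Hypothetical year of batch records: granulation moisture (% LOD) and the
    # fraction of tablets rejected at compression for each batch.
    moisture = np.array([1.8, 2.1, 2.4, 1.9, 2.6, 2.0, 2.8, 2.2, 2.9, 1.7, 2.5, 2.7])
    reject_rate = np.array([0.4, 0.5, 0.9, 0.4, 1.2, 0.5, 1.5, 0.6, 1.6, 0.3, 1.0, 1.3])

    r = np.corrcoef(moisture, reject_rate)[0, 1]
    print(f"Correlation between moisture and reject rate: r = {r:.2f}")

    # If a relationship like this holds up over time, the same moisture signal can
    # graduate from an information-only sensor to a control point, for example a
    # drying end-point criterion (the limit below is hypothetical).
    MOISTURE_LIMIT = 2.3
    for batch, m in enumerate(moisture, start=1):
        if m > MOISTURE_LIMIT:
            print(f"Batch {batch}: moisture {m:.1f}% exceeds the proposed drying end-point.")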

MR. LEIPER: I think the point that Joe made was a good one. We've actually lived in this area for an awful long time; things have been moving on, et cetera, and this is going to be no different. But our sheet anchor is actually the specification that it would be tested to in the marketplace, and our stability data. Because, you know, we're not going to stop stability testing at all. You know, so we're bracketing this anyway as it's moving along--I think that, you know, there's a lot of good reference data that we're generating.

But I think the difference is, as Emil said, and I think someone else said, that the testing that we do just now is just data, because it doesn't necessarily correlate with our processes. It's only if that data has the information content that reflects the process that we can actually do the kinds of things that Jerry was talking about.

And then, when you move on from that, as we build that up--not just within processes, but across processes--we begin to build up knowledge bases of approaches to formulation that work and tend to be reliable, and we can begin to become far more efficient at taking these things forward. So it's about data, it's about information, it's about knowledge.

And the last thing that we need is just that little bit at the top of this triangle; it's called wisdom. And that's to use it appropriately. And that requires the pragmatism that I think Art has shown most of the time I've had his acquaintance, when he's been guiding us in these meetings--it's a wisdom to use that properly and not just get bogged down with where we are today and the problems that we might have in the future. We're going to have problems in the future, but they're not going to be as big as some of the problems that we're facing today.

MR. CHISHOLM: Talking about data and information, I was in Dublin about four weeks ago, and they'd built a brand-new car park--all the big neon signs, all computer controlled--and the sign said "nearly full." Now, that's a completely useless bit of information for a driver, when you think about it, isn't it? The number of spaces that's left is useful information, but "nearly full"--that's pretty ridiculous, really.

And I think there's an awful lot about what we do and the pharmaceutical industry's a bit like the nearly full concept. When I look at some of the things that I've seen registered in the past, by us, by other companies, they use--yeah, I can't really say us, because this is being recorded. Using five different methods to measure the same thing and registering things like that. And I really cannot see the point in that kind of approach.

But I think if I was to take a view of where we are, we've got to start somewhere, and I think, because we're all children, really, at this game and there isn't that much experience built up, you've got to start, as I've said already, with correlation to existing methods, et cetera, et cetera, to build up confidence.

Gradually, as you move along, you'll realize that when you're controlling your process all your tablets are actually in spec, because you're looking at them statistically and you realize that that variable blend time you've got in there, which is accompanied by a certain algorithm is actually relevant and you can say that because you have all this evidence to prove it. And gradually looking at the spectra and a blend looking through a window will become the accepted primary method because people will know how to do it and it will have the same sort of background as HPLCs have for 20 years or whatever.

And I think you've got to approach it that way. You've got to learn to run--sorry, to crawl, before you can walk, before you can run. So, let's just be a little bit careful and take it nice and easy because there are a lot of goals to go for here. And, eventually, we will have methods that will become primary in their own right, which at the moment are certainly inferential and secondary.

DR. KIBBE: We have to do this again tomorrow. And I know what happens, at least someone at my age, if I sleep on something, I have to start all over again from scratch the next day.

But what I really think we've come away with is at least some kind of consensus that, first, the Agency need only provide the general guidelines and the acceptability, or the understanding, that we're going to accept good solid data. You've got it, we're happy.

I think we need to have some more concrete information for us to look at as a group and debate, to come up with or refine what our guidance is going to be to the Agency. And because there are people here who seem to have their minds firmly wrapped around some of these concepts, what I was going to ask is that this evening, while you're dining and, maybe, watching a rerun of "Jurassic Park," you do some things for us. And so, if you wouldn't mind, write a three- or four-sentence preamble that lays out validation of process analytical tools or technology in a general sense for us to look at.

And if Jerry would--he did such a good job of lists--I love his lists--if he could give us a working list, not complete and exhaustive, but a working list that we could suggest to the Agency as suggested things for the companies to look at as they go about validating both the process and their control mechanism or their technology, that would be a good place to start and then, I would wonder if there is any other aspect of it that someone would like to work on to bring to the table, so we could start to marry it all tomorrow.

Tomorrow, we're supposed to meet as a group and then break into our groups and continue our discussion. And I've noticed--and I get paranoid about these kinds of things--that at some point we have to come to a consensus and prepare a summary, you see. And being a good process analytical kind of person, I'd like to begin the process of preparing a summary as long in advance as we can. So, Tom, do you have any thoughts about what, besides those two items, might be added to our little gathering? I know you came up with a wonderful list of where PAT applied last time, and some other things.

MR. HALE: Yeah, I think two things. One--and I don't know where it fits in your frame--is that the idea of the impact of specification writing on validation is important in that categorization, perhaps.

And the other thing is the thing that we don't do right now and haven't talked about is this idea of batch versus continuous processes because they're treated differently. And it gets back to the control and all that stuff, but it's not a--it's not a current validation concept that's widely in practice but it's the natural result of some of these things that come down the road.

DR. KIBBE: So, perhaps, you said you had some of your thoughts in hard copy back at your office. We could throw that into the pot, and then we have a wonderful assistant here.

MR. D'SA: I had a question for Jerry. You know, you mentioned about this multivariate calibration for continuous closed-loop--the ASTM criteria? Because that would be worth reviewing.

DR. WORKMAN: That's E1655-00. I don't have a copy with me, but I do have a lot of the information.

MR. D'SA: Because some of the criteria that were used in those multivariate calibrations--especially the criteria used to validate the instrument itself, and then the criteria used for validation of the instrument for the intended use--that is maybe what the guidance wants to tackle.

DR. WORKMAN: Okay, I can provide that at a later time or some of the information.

DR. KIBBE: We have a laptop tomorrow that we can--we, not me, we--that is the--my father used it on me all the time when I was growing up. It's the "we"--we are going to clean the car; that didn't mean he, that meant me. So, I've learned to say that over and over again. "We," meaning you, so that when we have these thoughts from our colleagues, we could put them up on a projector and be able to see them--all right, and I think that would really help us a lot, because we're going to eventually have a summary made up of that kind of information that we'll share with the larger group.

MR. CHISHOLM: Can I make a suggestion?

DR. KIBBE: Yes, please, make suggestions.

MR. CHISHOLM: I think that something needs to be in here about the general principles that you want to be adopted, and when we did the definition early on, we used the word "timely," which I interpret as partly meaning statistically. So we have to take things like that into account, I think. Are we talking about statistical monitoring? Things like that, I think, have to go in the gate, but I think they're very relevant from a validation viewpoint. Because if you're actually monitoring throughout a batch and you're doing it on a statistical basis, you're in a far safer position than someone who's not doing that, and your whole set of validation criteria might, therefore, be different. So I think we have to--

DR. KIBBE: So, those are two points you're going to bring with you tomorrow, right? Don't you love this--it's wonderful. The power of the chair. I've always wanted to be a chair in charge of brilliant people and I've managed to get it and it's just--it's going to my head, I can't believe--go ahead, Jerry.

DR. WORKMAN: There's one thing that really bothers me, still. There's more than one, but I'll just mention one. A lot of the discussion seems to assume that all these great sensors are out there that you just plug in, and they give great numbers. And, of course, that's not true. But let's say it was--the problem I'm having in the thought process is a protocol on how to use the information. You have all these great sensors, they're working, they're providing the information. What kind of a protocol or procedure or recommendation is in place on how to use that information? There's an information glut--there can be. So how--

MR. HALE: I think that gets down to some sort of categorization or rationalization of how you're going to use the sensors. We add sensors--if only the process of adding sensors is what PAT means, we do that already. There's not a lot of difference except, perhaps, in complexity between a thermocouple and a NIR/IR, it's how you use it.

You can look at fluid-bed drying with thermocouple and air flow in a nice thermodynamic model and control it just as well as you can with a NIR/IR sensor, you just happen to measure different things and do it differently. So I think it gets back in the case of validation here of how you're specifically going to use the information.

DR. KIBBE: And also it gets back to what you accept as a usable output. When we talked last time about fingerprinting, it was the image of a three-dimensional graph made from all that data, and whether that image is superimposable on, or similar to, the reference, rather than looking at discrete data. And there are times--if you go back to the days when I first learned how to formulate, hardness was the snap of the tablet in your ear when you snapped it, and now we have very sophisticated equipment that might not even get as good as some of the old formulators could at getting it right. So, you're right, and I don't know the best way to approach that. But I know that we have to recognize that we're going to be swamped with data, and we have to recognize there has to be a way of looking at that as a pattern instead of a datapoint.

And I'm hoping that information technology, in the form of computational power, is going to come along parallel to where we're going with our sensors, and that at some point that computational power will allow us to look at a sea of data in a reasonable time frame and decide whether the pattern is like the pattern was when the process was running well. And, therefore, we will continue to march, because the pattern is correct. It's kind of like recognizing a rose: the next rose you see, if it looks like a rose and it's a healthy rose, we keep going. And computational power will get there, and then, if we're lucky, around about 2015 they won't need us; the computational power will have passed us and they'll just tell us what we've got.
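
A sketch of the pattern-matching idea Dr. Kibbe describes, comparing the current process "fingerprint" against the pattern recorded when the process was known to run well, might look like this; the fingerprints and the similarity threshold are hypothetical.

    import numpy as np

    # Hypothetical "fingerprint": a vector of in-process signals (spectral channels,
    # temperatures, torques, ...) captured when the process was known to run well.
    reference_pattern = np.array([0.12, 0.35, 0.80, 0.95, 0.77, 0.40, 0.15])

    def pattern_similarity(current, reference):
        """Cosine similarity between the current pattern and the reference pattern."""
        current = np.asarray(current, dtype=float)
        return float(current @ reference / (np.linalg.norm(current) * np.linalg.norm(reference)))

    current_pattern = [0.11, 0.36, 0.78, 0.97, 0.75, 0.42, 0.16]
    score = pattern_similarity(current_pattern, reference_pattern)

    SIMILARITY_LIMIT = 0.99   # hypothetical acceptance threshold
    print(f"similarity = {score:.4f}",
          "-> pattern matches the known-good state" if score >= SIMILARITY_LIMIT
          else "-> pattern drifting; investigate")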

DR. WORKMAN: So, is it a goal to try to come up with some discussion, at least on how we--how the information will be used from these sensors?

MR. LEIPER: I certainly agree with Jerry, I think it's a goal. I think that the problem that we're in just now is that the data that we're generating, we can't actually correlate it with process performance or product quality. That's where we happen to be now, and we've got to move on and say, okay, other sensors might give us more information-rich data and it's going to be an integrated procedure. I--and I don't think that we're going to move from where we are now into tremendous information overload because I don't think that we can make that step change.

DR. WOLD: Just one thing more that I think we need to have in the validation, and that is that the company should specify the infrastructure into which it puts this, and show that it is reliable in some way--because you can have the most beautiful equipment, and if you can't take care of the data, store them and show them back in a reliable way, it's worth very little. And, also, the preparedness for things going down. That was discussed before: redundancy, some kind of diagnostics showing that the instrument will work for another day with high probability, and detecting when it's going down. So you are prepared for problems with the equipment, because with very multidimensional equipment you'll get into more serious problems when it goes down than with individual things.
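
A sketch of the kind of instrument-preparedness check Dr. Wold mentions, trending a daily check standard so that drift is flagged before the instrument actually fails, might look like this; the readings and the control limits are hypothetical.

    import numpy as np

    # Hypothetical daily check-standard readings from an on-line instrument
    # (e.g., absorbance of a sealed reference), trended to catch drift early.
    history = [1.002, 0.999, 1.001, 1.000, 0.998, 1.003, 1.001]   # established baseline
    target, sigma = float(np.mean(history)), float(np.std(history, ddof=1))

    def daily_diagnostic(reading, warn=2.0, act=3.0):
        """Classify today's check-standard reading against simple control limits."""
        z = abs(reading - target) / sigma
        if z >= act:
            return "ACTION: take the instrument off line and investigate"
        if z >= warn:
            return "WARNING: schedule maintenance / verify against the reference method"
        return "OK: instrument fit for another day of use"

    for reading in (1.001, 1.0045, 1.010):
        print(reading, daily_diagnostic(reading))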

DR. KIBBE: I think Jerry's point and your point are very well taken. And we need to have something in there that says at a minimum that we recognize that these are problems and that the company should have a way that they intend to approach those problems. I mean, we can't tell them how to store their data, but if they don't have a method, we're a little worried.

Anybody else have anything else, because I think we're at a stage where we can now, cogitate, modulate, ruminate if you're an herbivore and think about what we've done, and then tomorrow come back and put together something that I think might be useful for the Agency to move forward with.

DR. MARK: One thing that I'm concerned about--Jerry, one part of what Jerry said--and maybe the Agency can address it. When I got involved with this in my work with Gary Ritchie at Purdue, we'd been working together on doing NIR for quite a while. And it started--one day he brought up the question of validation, which I'd never been involved with before. And we started talking about it a little bit. And I said--well, Gary showed up and he can verify what I said; maybe he can even actually say it better than I can--you know, rephrase his words.

I said to him, Gary, suppose we were to somehow get a calibration model--this is for NIR--suppose we get a calibration model handed down by God, so we know it's the right model for this stuff. And we went through all the validation exercises and we were able to show that this model passed perfectly: it was accurate, and it was linear, and it was robust, and everything else that the analysis had to be. Would the FDA accept it, if we just, you know, did it from the data somehow, by magic or whatever you want? And he said, no. He said the Agency wouldn't accept it, and the reason the Agency wouldn't accept it is because we would not have shown a causal relation between the known chemistry and physics and spectroscopy and what we were doing. You know, essentially what Jerry was calling a black box, okay. We would not have shown a causal relation inside that black box; it's an empty space there.

And, according to Gary--and, like I say, I may not be saying it right, and maybe I'm not understanding it right, and maybe the FDA has a different view and what I'm saying isn't correct, and maybe FDA can address this a little bit now--but he said the Agency wouldn't accept it for that reason. So, you know, the black box approach, as I understand it, would not be satisfactory as of right now. Now, maybe as a result of these meetings and so forth, the situation might change--maybe the Agency might change its policy with regard to that, so that if it was completely validated, we could get by without the causal relation in some of these cases--but right now the understanding is that it would not be acceptable. So that's a hole, I think, in what we're doing here, which one way or another needs to be filled in.

DR. KIBBE: Joe, you or me?

MR. FAMULARE: Oh, I would just say we're here trying to understand the question fully, I think would be the fairest way to answer that. You may have some wisdom you want to shed before I make an attempt.

DR. KIBBE: Well, I mean, cause and effect relationships are few and far between. Correlation that's reliable and predictable, where one predicts the outcome of the other and vice versa, is about as good as I think we're going to get with most of these measures. To truly understand the cause and effect, we would have to understand the underlying physics and a number of black holes in the universe and a lot of other things that might not apply.

I think what Jerry's saying is very true. If we have a reasonably tightly defined system and even if we don't know every little change in the blend dynamics within the system, if we have good correlation between two measures and they predict somehow the uniformity of the outcome, I think we're going to have to live with that.

MR. FAMULARE: I think we live with a whole lot less today.

MR. LEIPER: I'll second that.

DR. KIBBE: You can only elaborate if you go to a microphone.

DR. ANDERSON: While Gary comes up to the mike, a comment on the whole black box idea. Black boxes are validatable, but validating a black box is problematic because you don't know precisely how to challenge that box. You don't know what the black box is susceptible to, and you don't know what can go on in your process that you didn't account for early on that can change and affect you. So the less of a black box it is, the better, but that's not to say that a black box is invalidatable.

MR. RITCHIE: Yeah, along the lines of what Carl just stated, and where Howard was going, what I was really trying to pinpoint in saying that was: here is an equation that arrived on my desk that says that this process does a certain thing, and I could take that equation and measure that process repeatedly. The problem is, I don't know where the equation comes from. How many parameters did I measure to come up with that equation? And it's not good enough for me to accept that, you know, 3 wavelengths or 3 factors explain what that process is doing, and that I can then take those 3 wavelengths back again and cough up a result.

I can't--I never could accept that, unless I could show that there was a measure of specificity or that I was repeating a result due to the combinations of two or more factors and I could do that over and over and over again and know, maybe I don't know every little molecular aspect of, let's say the powder blending, or fluid pumping, but I know that every time I do it, no matter whether it's 2 o'clock in the morning or 2 o'clock in the evening, that those two wavelengths accurately predict.

There is a measure of correlation. Now, whether I can say that there's cause and effect due to the physics and what not, I don't know to what level we have to go. I imagine Svante would be able to help me out with really what we're doing when we take those factors or when we take those wavelengths. I don't believe we're looking at physics, but I know that there is a measure of confidence. And that's really what I'm trying to get at. I don't know if I made it worse or better.

I think that's what the Agency's looking for from us is when we come to them--what is the confidence level? Where's the repeatability? What is it that you're saying this equation is doing?

DR. KIBBE: Thank you.

DR. WORKMAN: Well, before Svante addresses it, I was going to say something. Of course, if you treat it as a black box, you can look at standard samples--at one or many standard samples--and determine if the black box is doing exactly what you think it's doing, what you said it was doing, and what it originally was doing. So you can, you know, determine if it is functional.

And then, if you look at full spectral data or full chromatographic data, you can compare that shape and see if it's within the calibration space of the shapes that you've looked at before. If it's outside of that, then, obviously, you have a problem. If it's inside of that, then you're interpolating--if it's done properly--and you know that you have some confidence in that result. That's my view.
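
A sketch of the calibration-space check Dr. Workman describes, asking whether a new spectrum falls inside the space spanned by the calibration set (interpolation) or outside it (extrapolation), might look like this; the spectra are simulated and the distance cut-off is hypothetical.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical calibration spectra: 40 samples x 50 wavelength channels.
    X_cal = rng.normal(size=(40, 50))
    mean_spectrum = X_cal.mean(axis=0)
    Xc = X_cal - mean_spectrum

    # Describe the calibration space with its first few principal components.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    k = 3
    P = Vt[:k].T                                  # loadings, 50 x k
    score_var = (s[:k] ** 2) / (len(X_cal) - 1)   # variance of the calibration scores

    def inside_calibration_space(spectrum, limit=9.0):
        """Mahalanobis-type distance of a new spectrum in the score space.
        The cut-off is hypothetical; in practice it would be justified
        statistically from the calibration set itself."""
        t = (np.asarray(spectrum) - mean_spectrum) @ P    # scores of the new spectrum
        d2 = float(np.sum(t ** 2 / score_var))
        return d2 <= limit, d2

    for label, spectrum in (("near the calibration set", mean_spectrum + 0.01 * rng.normal(size=50)),
                            ("far outside it", mean_spectrum + 100.0 * P[:, 0])):
        ok, d2 = inside_calibration_space(spectrum)
        print(f"{label}: distance^2 = {d2:.1f} ->",
              "interpolating (result can be trusted)" if ok else "extrapolating (flag the result)")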

DR. WOLD: Yes, about this little black box. I think there are two different issues. One is to say, as Jerry does, that we validate it as a black box, and that's a very nice thing to do because then we don't make any assumptions--we change things we can change and we see that it reacts in the way we want it to react. That doesn't mean that we believe it is a black box.

Now, I don't think that anyone here in the room is willing to accept a PAT, or anything else, if we don't think that it is based on scientific principles and built according to our best scientific understanding. Then we know how to deal with it, and we know what to expect from it. So we don't have black boxes. But when we validate them, it is an advantage to deal with them as if they were black boxes. Those are two different things.

DR. KIBBE: Thank you.

Go ahead, Tom.

MR. HALE: Can I ask a logistics issue?

DR. KIBBE: Yes, you can ask a logistics issue.

MR. HALE: On our homework--

DR. KIBBE: Yes.

MR. HALE: --if we bring it electronically, is that okay?

[Inaudible comment off microphone.]

DR. KIBBE: The answer is yes. They're working out the logistics of the logistics. Yes, if we bring it in electronically, he'll be able to work it in somehow, and you have a question about logistics.

DR. LO: I just want to say, anybody that wants to do this, I'd prefer electronic to my typing which is two fingers. So, please, electronics.

DR. KIBBE: Tomorrow, we are supposed to be called to order at 8:00 a.m. by Dr. Layloff, who will not be here, but I will and I will usurp his chairman's authority and call us to order tomorrow. And then, we'll be making the regional--Kathleen will make her statements, you know, how she says that none of us are biased because we don't know anyone else in the world. And then we'll go into our working groups and we'll continue to do this until we get close to lunch and then we'll be able to report back to the group. So, have a good evening folks and we'll look forward to tomorrow.

[Whereupon, at 4:47 p.m., the Subcommittee adjourned to reconvene at 8:00 a.m., Thursday, June 13, 2002.]

- - -