Managing Marketed Product Stability Programs

Transcription

Investigating Out of Specification (OOS) and Out of Trend (OOT) Results in Stability
IVT’s Stability Programs
December 8, 2010
Philadelphia, PA
Frank Diana
1
EFFECTIVE METHODOLOGIES FOR OOS INVESTIGATION OF MARKETED PRODUCTS, INCLUDING OOS OCCURRENCES DURING STABILITY SAMPLE ANALYSES
• Introduction
• OOS Investigations
• OOS Results - Stability Studies
• Deficiencies
• Retest Examples
2
INTRODUCTION
References/Background
 Judge Wolin Decision (2/93)
• Averaging OOS with Passing Results
• Discarding of Raw Data
• Multiple Retests with No Pre-Specified End Point
• Inadequacy of Failure Investigations
• Method Validation Deficiencies
 Investigating OOS Test Results for Pharmaceutical Production (FDA Guidance, issued 10/2006)
3
INTRODUCTION
(Continued)
SOP(s) should include:
 Investigation of OOS Results
 Repeat Analysis/Retest/Resample
 Release and Stability Samples need to be included
 Analyst/Supervisor Responsibilities
 Out of Trend or Profile Results
• What is OOT?
• Can be a separate SOP
• Can be part of the same SOP but a different procedure
4
OOS INVESTIGATION
 Analyst identifies an OOS result and alerts supervision
 Informal lab inspection, including review of notebooks/worksheets, chromatograms, testing procedures, calculations, instrument(s) used, glassware, reagents, standards, solvents, etc.
 Can include additional measurements on sample/standard preparations used in the original test (e.g., re-injection, re-dilution, additional extraction). This is not a retest; it is typically called re-analysis.
 Review results of other tests performed on the same sample
5
ANALYTICAL LABORATORY CHECKLIST FOR ANALYTICAL ERRORS
[Checklist table: each question is answered YES / NO / N/A]
6
OOS INVESTIGATION
(Continued)

Assignable Analytical Cause
• Must be demonstrated/justified
• Unsupported speculation is not acceptable
• Document (Analyst/Supervisor)
• Repeat as originally tested
• Replace original results
 Procedural Error during the analysis – stop the analysis and document the error
 No assignable analytical cause – formal investigation
7
OOS INVESTIGATION
(Continued)
[Flowchart]
OOS Result → Lab Investigation (Analyst/Supervisor) → Assignable Cause?
• Yes → Invalidate Original Result / Document Investigation → Repeat Test
• No → Formal Investigation
8
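The branch logic on this slide can be sketched as a small Python function. The function and outcome names are illustrative assumptions, not part of any SOP; the point encoded is that an assignable analytical cause must be both demonstrated and documented before the original result may be invalidated.

```python
from enum import Enum

class Outcome(Enum):
    INVALIDATE_AND_REPEAT = "invalidate original result, document, repeat test"
    FORMAL_INVESTIGATION = "no assignable cause: escalate to formal investigation"

def lab_investigation(assignable_cause_found: bool, cause_documented: bool = False) -> Outcome:
    """Sketch of the analyst/supervisor lab-investigation branch for an OOS result.

    An assignable analytical cause must be demonstrated AND documented before
    the original result may be invalidated; unsupported speculation is not
    acceptable, so a suspected-but-undocumented cause still escalates.
    """
    if assignable_cause_found and cause_documented:
        return Outcome.INVALIDATE_AND_REPEAT
    # No demonstrated, documented cause -> formal investigation
    return Outcome.FORMAL_INVESTIGATION
```

A suspected cause alone (`lab_investigation(True, False)`) still routes to the formal investigation, mirroring the "unsupported speculation is not acceptable" rule.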
OOS INVESTIGATION
(Continued)

Formal Investigation
• Typically led by Quality
• Batch folder review
• Other lots / products affected
• Manufacturing history (i.e., product, equipment, etc.)
• Historical stability data
• Actual or probable cause
• Corrective actions and individuals responsible
9
OOS INVESTIGATION
(Continued)

Laboratory phase of Formal Investigation
• Retest / resample
• SOP in place
– Decision-making process
– Number of sample preparations / number of analysts predetermined
 specifically defined in the SOP or in a protocol written prior to retesting
 repeated testing until a passing result is obtained is not acceptable (testing into compliance)
– Confirmed / not confirmed
– Timeframes (30 business days)
10
OOS INVESTIGATION
(Continued)
[Flowchart]
Retest/Resample → Invalid Sample?
• Yes → Resample / Document → Repeat Test
• No → Retest based on SOP/Protocol (increased # of replicates) → OOS Clearly Atypical, Results within Spec?
– Yes → Invalidate / Document → Report Results
– No → OOS Confirmed
11
OOS INVESTIGATION
(Continued)

Resampling
• Only when there is evidence that the original sample was prepared incorrectly or was not representative of the batch
• Justification for resampling must be fully documented
• Qualified sampling
12
OOS INVESTIGATION
(Continued)

Averaging
• May be valid in some cases, e.g., microbiological assays
• Inappropriate when uniformity is being evaluated or cannot be assumed, e.g., content uniformity, blend uniformity
• To use averaged results for assay, all individual test results must be within specifications
• Replicate injections vs. sample result
– Replicate injections – averaging is acceptable
13
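The assay-averaging rule above can be made concrete in a short sketch. The function name and the default spec limits (90.0–110.0% of label, taken from the retest examples later in this deck) are illustrative assumptions.

```python
def reportable_assay_average(results, lo=90.0, hi=110.0):
    """Return the average assay only if every individual result is within spec.

    Averaging a passing and a failing result to obtain a passing reportable
    value hides the OOS result and is not acceptable; the OOS must be
    investigated instead.
    """
    if any(r < lo or r > hi for r in results):
        raise ValueError("individual OOS result present: investigate, do not average")
    return sum(results) / len(results)
```

So `reportable_assay_average([99.7, 100.1])` returns an average, while `reportable_assay_average([99.7, 88.9])` refuses and forces an investigation.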
OOS INVESTIGATION

Outlier Tests
• Homogeneous sample is tested, such as for assay
• Not applicable for content uniformity, dissolution, release rate
• Use only on rare occasions, with full justification
• Statistical evaluation
• See reference for an interesting discussion on this topic: JD Hofer and EC Richard, “A Guide to Determining Sample Sizes and Outlier Criteria in Analytical Investigations,” Pharmaceutical Technology, March 2001, pages 52-66.
14
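One common statistical outlier test for a homogeneous sample is Grubbs' test; a minimal sketch of the test statistic follows. The choice of Grubbs' test here is mine, not the slide's, and the critical value G must be taken from published tables for the given n and significance level (roughly 2.29 for n = 10 at the two-sided 5% level, quoted from common tables and worth verifying).

```python
from statistics import mean, stdev

def grubbs_statistic(results):
    """G = max |x_i - mean| / s, with s the sample standard deviation.

    G is compared against a tabulated critical value for the given n and
    significance level. The test assumes approximately normal data and a
    homogeneous sample (e.g., an assay) -- never content uniformity,
    dissolution, or release-rate data, where variation is the measurand.
    """
    m, s = mean(results), stdev(results)
    return max(abs(x - m) for x in results) / s
```

Even when G exceeds the critical value, the slide's caveat stands: outlier rejection needs full documented justification and should be rare.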
OOS RESULTS - STABILITY STUDIES

Similar investigation to release OOS
 Significant change at accelerated conditions
 OOS at accelerated or intermediate conditions
 Understand typical product trends – identify significant OOT results
 Release versus stability specifications
 Unknown peaks in chromatograms
 Typical sample chromatograms should be reviewed from previous time points
 Standard chromatograms
15
OOS RESULTS - STABILITY STUDIES

Release problems that show up on Stability
• High initial degradation product (impurity) level in API
• Atypical variability in tablet assays (content uniformity)
• pH near limits at release (analytical variability)
• Stage 2 dissolution, or atypically low average dissolution result
• High moisture level in tablets (degradation)
16
OOS RESULTS - STABILITY STUDIES
 Stability Problems
• Alternative storage condition
• Change package – more protective
• Shorter expiration date
• Reformulate product
17
OUT OF PROFILE/TREND RESULTS

Alert or in-house limits
• Result vs. previous data point
• +/- from the previous value
• +/- from label
• Set value compared to specification
• Goal is to identify a potential problem before it becomes a major problem
• Notification of supervision
18
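The alert-limit checks above can be sketched as a small screening function. The ±5% limits and the label-claim default are illustrative placeholders only; real alert limits are set from historical product data, and a triggered flag means "notify supervision and look closer," not OOS.

```python
def oot_flags(result, previous, label_claim=100.0,
              max_delta_prev=5.0, max_delta_label=5.0):
    """Screen a stability result against in-house alert (OOT) limits.

    Limits here (+/-5 percentage points vs. the previous time point and vs.
    label claim) are illustrative assumptions; actual limits come from
    historical trending of the product. Returns the list of triggered flags;
    an empty list means no alert.
    """
    flags = []
    if abs(result - previous) > max_delta_prev:
        flags.append("shift vs. previous time point")
    if abs(result - label_claim) > max_delta_label:
        flags.append("shift vs. label claim")
    return flags
```

For example, a drop from 99.5% to 92.0% of label trips both checks even though 92.0% is still within a 90.0–110.0% specification, which is exactly the "catch it before it becomes a major problem" goal.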
[Figure: Degradation product level (approx. 0.0–0.6%) for the 200 mg tablet vs. nominal months on stability (0–72 months), plotted by package type (blister vs. HDPE), with suspect lots marked. Suspect blister lots are LC129 (200 mg), NB081 (200 mg), and NC169 (300 mg).]
19
[Figure: All data graphed by package size – AC (0.0 to 1.5) vs. months (0 to 18), for package sizes 100 and 500.]
20
REGULATORY IMPLICATIONS

Field Alert/Interaction with Regulatory Authorities
• 21 CFR 314.81(b)
(1) NDA – Field alert report. The applicant shall submit information of the following kinds about distributed drug products and articles to the FDA district office that is responsible for the facility involved within 3 working days of receipt by the applicant:
(ii) Information concerning any bacteriological contamination, or any significant chemical, physical, or other change or deterioration in the distributed drug product, or any failure of one or more distributed batches of the drug product to meet the specifications established for it in the application.
21
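The 3-working-day clock can be sketched as below. This is a simplification that counts only Monday–Friday as working days and ignores holidays (an assumption; a real compliance calendar would exclude company and federal holidays too).

```python
from datetime import date, timedelta

def field_alert_deadline(receipt: date, working_days: int = 3) -> date:
    """Latest field-alert submission date, counting Mon-Fri as working days.

    Holidays are deliberately ignored in this sketch; a production version
    would consult an actual business calendar.
    """
    d = receipt
    remaining = working_days
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Mon=0 .. Fri=4
            remaining -= 1
    return d
```

For a result received on Friday, December 3, 2010, the weekend does not count, so the deadline lands on Wednesday, December 8, 2010.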
Proactive Approaches to Minimize OOS
 Well thought out and written SOPs
 Justifiable/Reasonable Specifications
 Good analytical methods – method
development/validation/transfer
 Well trained analysts and supervision
• Follow SOPs
• Follow Methods
• Training Program/Experience
• Good Documentation Practices
• OOS/OOT are reviewed, investigated and
resolved ASAP
22
Proactive Approaches to Minimize OOS
 Equipment Program that is in control
 Sample flow is clear and documented
 Data are reviewed in a timely manner and trended
 Recurring problems are reviewed in a timely
manner, trended as necessary, and resolved
• Equipment
• Methods
• Analysts
• SOPs
• Standards
23
Proactive Approaches to Minimize OOS
 Internal audits performed periodically
 Periodic Group meetings
• Audit findings
• Recurring problems
• SOP training
• Metrics
• Discuss issues
24
DEFICIENCIES

Examples of FDA 483 citations; information obtained from various sources (GMP Trends, Pink Sheets, Drug GMP Report, Washington Drug Letter)
 Investigations
• Lack of thorough investigations; inadequate investigations into OOS
• Failure investigations are not conducted / completed in a timely fashion
• Not conducting timely stability investigations, including failures beyond the expiration date
25
DEFICIENCIES (Continued)

Investigations (Continued)
• The company’s investigations did not extend to other packages of the failed batches or to other batches manufactured at the same time
• The batches placed on stability testing each year are a representative sample of the batches manufactured during that year; failure of a stability sample should trigger a review or testing of other batches manufactured during the year
• Investigation report does not include the corrective actions necessary to prevent similar recurrences. Reports do not indicate if similar OOS results were reviewed and if other lots are affected.
26
DEFICIENCIES (Continued)

Investigations (Continued)
• There is no SOP for conducting product investigations or tracking product failures
• Several investigation reports indicate the corrective action plan is to instruct analysts to follow procedures more carefully; however, all analysts involved had been certified to conduct the analyses (training program issue)
• OOS results were invalidated without sufficient data to support conclusions, such as poorly trained analysts and equipment problems.
27
DEFICIENCIES (Continued)

Retesting
• Original assay test data from an accelerated stability timepoint was discarded without assignable cause and without a written investigation. Reanalysis (same samples, new standard prepared) was performed. The results did not invalidate the original results. There was no evidence that the original accelerated stability data were not legitimate. All of the results, original and reanalysis, were within specification but generally higher than the initial stability results.
28
DEFICIENCIES (Continued)

Retesting (Continued)
• An assay result below the action limit was “invalidated” without any assignable cause identified and solely on the basis of two retests. The SOP states that unless the working solution is retested, seven out of eight samples are required to be similar in order to invalidate a sample as non-representative. This written procedure is the standard followed by your firm’s other laboratories.
29
DEFICIENCIES (Continued)

Retesting (Continued)
• The assay result was invalidated with the unsupported assumption that an injector problem caused the low results, since higher assay results were obtained for other lots run before and after. There was no justification of why an injector problem was assumed for the four consecutive injections for this lot (with all four injections substantially agreeing) and why no other results from this HPLC run were questioned.
30
DEFICIENCIES (Continued)

Retesting (Continued)
• The procedure for rejection of analytical data and reassay of samples is inadequate, in that there is no information explaining in detail how the retesting is conducted. There is also no information explaining at what point testing should end and the product be evaluated.
• There was no communication between the firm and the contract laboratory outlining retesting procedures to follow for samples that fail to meet specifications.
31
DEFICIENCIES (Continued)

Retesting
• Failure to investigate, justify, and record deviations from written specifications and test procedures when repeated testing is done due to initial test failures.
• Averaging passing and failing results without investigating the cause of the OOS
• Discarding test data without just cause.
32
DEFICIENCIES (Continued)

Retesting/Resampling
• Lack of personnel training in laboratory procedures and stability testing is evident from the discarding of raw analytical data, samples tested several times, and supervisory personnel allowing such events to occur.
• The firm failed to follow its own SOPs in regard to the resample, and the firm did not address this in the investigation report. The investigation implies that a subsequent sample, the resample, was obtained using correct sampling procedures. The firm has no documentation as to when the resampling actually occurred.
33
DEFICIENCIES (Continued)

Trending / Review of Stability Data / Notification
• There has been no investigation of the increase in particulate matter test results during stability testing through 24 months.
• The firm consistently found dissolution results that had high variability of over eight percent relative standard deviation and/or average results that differed from the average assay result by more than ten percent. No investigation was performed although it was stated in the company’s SOP that values of the dissolution test should compare favorably with assay values.
34
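The two conditions in the dissolution citation above can be screened numerically. The limits (RSD over 8%; average differing from the assay average by more than 10) come from the cited observation; treating the 10% as percentage points of label claim is my assumption, as is the function name.

```python
from statistics import mean, stdev

def dissolution_flags(dissolution_results, assay_mean,
                      rsd_limit=8.0, assay_diff_limit=10.0):
    """Flag the two conditions from the 483 citation: relative standard
    deviation over 8%, and a dissolution average differing from the assay
    average by more than 10 (interpreted here as percentage points of label).
    """
    m = mean(dissolution_results)
    rsd = 100.0 * stdev(dissolution_results) / m
    flags = []
    if rsd > rsd_limit:
        flags.append(f"RSD {rsd:.1f}% exceeds {rsd_limit}%")
    if abs(m - assay_mean) > assay_diff_limit:
        flags.append(f"dissolution mean {m:.1f} vs. assay {assay_mean:.1f} "
                     f"differs by more than {assay_diff_limit}")
    return flags
```

Per the firm's own SOP, any returned flag should trigger an investigation rather than be left undocumented.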
DEFICIENCIES (Continued)

Trending / Review of Stability Data / Notification (Continued)
• There are no standard operating procedures to conduct investigations or statistically analyze real-time stability data points that exhibit abnormal stability patterns. Data submitted in the Annual Reviews demonstrate that several products have unexpected increases in potency that exceed normal data variability. For example:
– Lot ... increased 12% in potency between 0 and 3 months.
– Lot ... increased 6% in potency between 0 and 3 months, then decreased by 9% between 3 and 6 months.
– Lot ... decreased by 7% between 0 and 3 months, remained constant between 3 and 12 months, then decreased 6% between 9 and 18 months.
35
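The interval changes cited in those examples are simple to compute and trend; a sketch follows. The function name is an assumption, and the threshold for "exceeds normal data variability" must come from the product's historical data, not from this code.

```python
def interval_changes(timepoints, potencies):
    """Percent change in potency between consecutive stability time points.

    Returns (t_start, t_end, pct_change) tuples. Changes larger than the
    product's established normal variability (e.g., the cited 12% rise
    between 0 and 3 months) should trigger an investigation.
    """
    pairs = list(zip(timepoints, potencies))
    return [
        (t0, t1, 100.0 * (p1 - p0) / p0)
        for (t0, p0), (t1, p1) in zip(pairs, pairs[1:])
    ]
```

Running it on a series like 100.0 → 112.0 → 101.9 (months 0, 3, 6) reproduces the kind of +12% then roughly −9% swing the citation describes.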
DEFICIENCIES (Continued)

Trending / Review of Stability Data / Notification (Continued)
• There is no system in place to assure that upper management is notified of product deficiencies.
• Supervisory review of analytical data is not performed in a timely manner.
• No investigation was done to determine whether highly variable and/or non-linear test results at various stability test stations were due to true product variability or to defects in the testing.
36
RETEST EXAMPLES - DIFFICULT DECISIONS

Assay Results - 12 months / 25°C / 60% RH Timepoint
 Results: 99.7% of label; 88.9% of label
 Limits: 90.0 - 110.0%
 Previous Data: 98.5 - 101.8%
 The analyst has just alerted the supervisor of these results
 What are the next steps that should be followed?
 What if the original results were 94.1 and 92.9% of label?
 What are the next steps that should be followed?
37
RETEST EXAMPLES - DIFFICULT DECISIONS
(1)
Assay Results - 12 months / 25°C / 60% RH Timepoint
Original: 99.7% of label; 88.9% of label
Limits: 90.0 - 110.0%
Previous Data: 98.5 - 101.8%
 No analytical error found
 Retest (7 replicates): 100.8, 99.1, 99.4, 99.7, 99.8, 100.5, 100.4
 What decision would you recommend?
 Invalidate 88.9%?
38
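The arithmetic behind this example is worth making explicit: the retest replicates and the original 99.7% result all sit inside the previous-data range (98.5–101.8%), while the 88.9% result is far outside both. A short calculation, using only the numbers on the slide:

```python
from statistics import mean, stdev

# Retest replicates from example (1)
retest = [100.8, 99.1, 99.4, 99.7, 99.8, 100.5, 100.4]
m, s = mean(retest), stdev(retest)

# The retest mean is about 99.96% of label with a small spread; the original
# 99.7% result agrees with the retest and with previous data (98.5-101.8%),
# while 88.9% lies more than 15 retest standard deviations below the mean.
print(f"retest mean = {m:.2f}%, s = {s:.2f}, "
      f"88.9% is {(m - 88.9) / s:.1f} s.d. below the retest mean")
```

This calculation supports, but does not by itself justify, invalidating the 88.9% result: per the flow earlier in the deck, without an assignable cause the decision belongs to the formal investigation and must be documented under the SOP/protocol.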
RETEST EXAMPLES - DIFFICULT DECISIONS
(2)
Assay Results - 24 months / 25°C / 60% RH Timepoint
Original: 91.5% of label; 89.4% of label
Limits: 90.0 - 110.0%
Previous Data: 92.0 - 99.5%
 No analytical error found
 Retest (7 replicates): 91.7, 90.6, 91.8, 90.3, 91.2, 90.2, 91.5
 What decision would you recommend?
 Can 89.4% be invalidated?
 Field Alert?
39
RETEST EXAMPLES - DIFFICULT DECISIONS
(3)
Assay Results - 12 months / 25°C / 60% RH Timepoint
Original: 99.7% of label; 88.9% of label
Limits: 90.0 - 110.0%
Previous Data: 98.5 - 100.8%
 During lab investigation, re-shake and re-inject
 Result: 99.4%
 Analytical cause identified
 Next steps?
 Assume this has happened several times in the past
40
RETEST EXAMPLES - DIFFICULT DECISIONS
(4)
Content Uniformity
Uncoated tablets - release testing
Ten results (% of label): 99.9, 71.4, 99.4, 98.7, 97.5, 101.6, 98.9, 101.2, 99.5, 100.7
 No analytical error found
 Can 71.4% be invalidated?
 What lot disposition will you recommend?
41
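For context on example (4), the USP <905> stage-1 acceptance value can be computed for these ten results. The constants used (n = 10, k = 2.4, L1 = 15.0, M clipped to 98.5–101.5 for T = 100) are quoted from the general chapter as I know it and should be checked against the current version; the function name is mine.

```python
from statistics import mean, stdev

def acceptance_value(results, k=2.4):
    """Stage-1 (n=10) content-uniformity acceptance value, AV = |M - xbar| + k*s.

    M follows the USP <905> definition for T = 100: the sample mean clipped
    to the interval [98.5, 101.5]. Stage 1 passes when AV <= L1 = 15.0.
    """
    xbar, s = mean(results), stdev(results)
    m_ref = min(max(xbar, 98.5), 101.5)
    return abs(m_ref - xbar) + k * s

# Results from example (4): the single 71.4% tablet drags the mean down and
# inflates s, so AV lands far above 15 and stage 1 fails.
results = [99.9, 71.4, 99.4, 98.7, 97.5, 101.6, 98.9, 101.2, 99.5, 100.7]
av = acceptance_value(results)
```

This shows why, with no analytical error found, the 71.4% result cannot simply be invalidated: it is the signal that uniformity may genuinely be the problem, which is exactly the case where averaging and outlier tests do not apply.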
RETEST EXAMPLES - DIFFICULT DECISIONS
(5) The firm consistently found dissolution results that had high variability of over eight percent relative standard deviation and/or average results that differed from the average assay result by more than ten percent. No investigation was performed although it was stated in the company’s SOP that values of the dissolution test should compare favorably with assay values.
• Analyst result for the dissolution time point (18 months) is 110% +/- 6%
• Stability results for dissolution for this lot range from 94 - 97% at earlier time points
• Other lots have had issues consistent with the deficiency above
• Assay results are 99.2 and 99.7% of label
• What are your next steps?
42