Does OSHA’s Injury Tracking Application Data Provide Reliable Representation of U.S. Company Injury and Illness Metrics?
Abstract
The Occupational Safety and Health Administration (OSHA) requires employers to investigate and document injury and illness cases in recordkeeping forms and recommends annual calculation of incidence rates as a measure of their safety program performance. Recent research literature calls into question the reliability and validity of OSHA recordkeeping data and its subsequent incidence rates (Hallowell, 2023).
The objectives of this study are to investigate the consistency and reliability of OSHA’s Injury Tracking Application (ITA) datasets and their relationship to the U.S. Bureau of Labor Statistics (BLS) incidence rates (Survey of Occupational Injuries and Illnesses [SOII] reports, YEAR). Descriptive statistics tables and annual incidence rate charts for OSHA ITA datasets collected in 2016-2022 raised concerns about entry errors. Therefore, a three-step corrective/reductive analysis was employed to reduce those errors. The average percentage of entries removed per year was 12.9% (ranging from 10.6% to 14.3%). The corrective steps greatly improved the consistency and reliability of the dataset and reduced variable standard deviations. The most impacted/reduced variables were total employees (76.5%), total hours worked (46.7%), hours worked per employee (95.2%), and percent of zero injury reports (6.3%). The number of cases and days reported was reduced by an average of < 2.5%. These results differ markedly from the commonly proposed “under-reporting” issues for OSHA recordkeeping and BLS SOII reports. This study found that over-reporting of hours worked or number of employees caused most of the inconsistencies in incidence rate representation.
Implications from this study indicate that the OSHA ITA system needs better management to improve employer self-reported 300A data. The validity and reliability of BLS SOII rates are also questioned and demand further investigation.
KEY WORDS: OSHA injury tracking application, OSHA recordkeeping data, safety metrics, lagging indicators, incidence rates
1. Introduction
Environmental health and safety (EHS) professionals are required by the Occupational Safety and Health Administration (OSHA) to investigate, document, track, and possibly report safety and health incidents that occur as part of assigned work. OSHA injury and illness recordkeeping data has long been the primary source for safety performance measurements for an organization’s safety program. Many organizations focus on injury and illness incidence rates and set annual safety program performance goals of zero lost time cases or a maximum incidence rate. The two primary injury and illness rate calculations used by EHS professionals and OSHA are the total recordable incident rate (TRIR) and days away, restricted, and transfer (DART) incidence rate.
The emphasis on OSHA recordable measures has greatly influenced the EHS profession for decades. Unfortunately, organizations focus too much on these rates and on comparisons to similar industry averages. This drives efforts to appear better than they are, typically through under-reporting of recordable injuries. Achieving below-average injury rates can lead to winning bids on projects and prestigious safety awards. Prospective suppliers and business partners may request a company’s TRIR or DART to evaluate its safety program as part of their business decision. This creates additional motivation for companies to under-report or reclassify incident cases to achieve lower injury and illness rates (Avetta, n.d.).
Many EHS professionals are evaluated by their employers based on the performance of the safety program they manage. It has long been assumed that OSHA recordkeeping measures are a direct result of a safety manager’s efforts and influence on the organization. Due to these pressures to lower TRIR and DART data, many EHS professionals may not be accurately reporting TRIR and DART rates or unjustly reclassifying injury and illness cases as non-recordable. By hiding true injury and illness data or not completing effective incident investigations, risks or causes of injuries are not being prevented, and organizations will be at greater risk of major incidents or disasters. Further, the employer may not be aware of reportable events that occurred because some workers do not report incidents due to concerns of losing their job or other opportunities such as a raise or promotion.
The TRIR metric, typically referred to as the “Recordable Rate,” is calculated by:
TRIR = (Number of OSHA recordable cases [Columns H+I+J of the 300 log] × 200,000) / (Total number of hours worked)
The DART rate, which represents a higher severity injury metric, is determined by:
DART = (Number of lost-time cases [Columns H+I of the 300 log] × 200,000) / (Total number of hours worked)
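Both rates normalize case counts to 200,000 hours (100 full-time workers at 2,000 hours per year). The calculation can be sketched as follows; the function name and sample numbers are illustrative, not part of any OSHA tooling:

```python
def incidence_rate(cases: int, hours_worked: float) -> float:
    """Cases per 100 full-time-equivalent workers (100 FTE x 2,000 hr = 200,000 hr)."""
    if hours_worked <= 0:
        raise ValueError("hours_worked must be positive")
    return cases * 200_000 / hours_worked

# Hypothetical employer: 6 recordable cases (columns H+I+J), of which
# 2 are lost-time cases (columns H+I), across 400,000 hours worked.
trir = incidence_rate(6, 400_000)  # 3.0 recordable cases per 100 FTE
dart = incidence_rate(2, 400_000)  # 1.0 DART case per 100 FTE
```

Note that a zero in the denominator makes the rate undefined, which is exactly the class of entry error examined later in this study.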
According to BLS (2019), TRIR and DART rates provide direct insight into a company’s past safety performance. An “OSHA recordable” is an injury or illness that meets OSHA’s published criteria by causing harm beyond first aid treatment (29 CFR 1904.7). Injury and illness incidents are investigated and, if deemed recordable, are entered into OSHA recordkeeping forms 301, 300, and eventually 300A. Employers with fewer than 11 employees and certain low-risk industries are partially exempt from OSHA’s recordkeeping requirements (OSHA c.; OSHA d.). The list of partially exempt industries is based on the 2007 NAICS codes.
Since 2016, most employers required to keep OSHA forms must upload their 300A form data to the OSHA ITA. OSHA provides annual 300A data from the ITA site as downloadable files (CSV or MS Excel). This information is meant to help EHS professionals, researchers, academics, and OSHA evaluate the safety of a workplace, understand industry hazards, and implement worker protections to reduce and eliminate hazards, preventing future workplace injuries and illnesses (OSHA a.). The creation of the ITA, along with the BLS SOII report, creates an opportunity to compare what should be duplicate data sets (OSHA b.).
Businesses measure safety to make crucial, data-driven decisions to improve plant and construction safety management. The traditional approaches to measuring safety have been TRIR, DART, and fatality rates, which are driven by OSHA guidelines and industry experts. Researchers have begun questioning the validity of OSHA data when comparing their organizations to OSHA TRIR and DART averages (Hallowell, 2023). Therefore, many safety professionals are urging others to look into true and accurate means of measuring safety performance, such as leading indicators (Oguz Erkal et al., 2023). The old adages “Garbage in, garbage out” and “The truth will set you free” come to mind when gathering and analyzing OSHA average numbers. Even BLS and OSHA acknowledge underreporting of injury data as an issue. An evaluation of recordkeeping performance during OSHA’s national emphasis program postulated that an error rate of up to 66% in injury reporting is possible (Fagan and Hodgson, 2017). Based on these findings, TRIR and DART data can be examined for additional evidence of underreporting and possible correlations. Many safety professionals struggle to show how OSHA recordkeeping requirements lead employers to hide or underreport injuries (Wuellner and Bonauto, 2014). Completeness of data, along with accuracy and validity, is of crucial importance when dealing with national data. Studies have shown that incomplete data and an unclear data management process result in poor decisions by organizations (Kilkenny and Robinson, 2018).
Research literature increasingly demonstrates that safety program performance is not purely determined by severe injuries or recordable injury cases (Lander et al., 2011). Many studies have identified employer misconceptions and non-compliant practices related to the OSHA recordkeeping requirements (Fagan and Hodgson, 2017). The focus should be on what matters: preventive measures for safety. According to research studies, companies would rather under-report injuries to OSHA than risk inspections (Manjourides and Dennerlein, 2019). Previous research has shown that up to 70% of the OSHA log can be under-reported (Rosenman et al., 2006).
This study’s objectives are to determine:
- If the OSHA ITA annual data contains minimal self-reported errors.
- If the OSHA ITA annual data is comparable to BLS SOII data.
2. Methods
This study is historical research, or historiography, which “attempts to systematically recapture the complex nuances, the people, meanings, events, and even ideas of the past that have influenced and shaped the present” (Berg and Lune, 2012). The analytical methods used for this study are primarily quantitative, with some qualitative assessment of annual trends. The public data files are available through the OSHA ITA and the BLS SOII reports. OSHA ITA data from 2016-2022 was downloaded to Microsoft Excel spreadsheets for storage and analysis. Data mined from BLS SOII reports were added to the results spreadsheets created by the OSHA ITA analyses.
2.1 Study Design
This study design is ex post facto and relies on historical public records created and managed through federal government-funded agencies. The public records were collected by self-reporting of OSHA recordkeeping data by two separate (but similar) systems (Brent and Leedy, 1990). An Institutional Review Board (IRB) approval was not required because no human subjects were involved, and the downloaded data is considered a secondary analysis. All data used in this study is publicly available through the U.S. Department of Labor website.
The study population is organizations mandated to report (or upload) their annual 300A data to OSHA as part of the ITA program. Organizations excluded from this study include those exempt from OSHA recordkeeping requirements, locations outside of the U.S., and non-participants in the OSHA ITA program. A primary example of excluded populations are organizations with 10 or fewer employees and special industries under 29 CFR 1904.2. It should be noted that all organizations must report to OSHA any workplace incident that results in an employee’s fatality, in-patient hospitalization, amputation, or loss of an eye, but that is a separate report system from the OSHA ITA.
2.2 Variables and Measures
The two primary study variables are TRIR and DART, which were calculated from the OSHA ITA databases (Brent and Leedy, 1990). In 2024, OSHA overhauled the format of its website to make ITA data access easier for the public and to include other OSHA 301 and 300 form data (OSHA b.).
It is important to note that within the new OSHA ITA data page, OSHA refers to potential quality issues within their data sets. In addition, the data should not be used as a sole source of measuring safety program performance. OSHA’s statement on data quality is:
“While OSHA takes multiple steps to ensure the data collected are accurate, problems and errors invariably exist for some establishments.” “Efforts are made during the collection cycle to correct submission errors; however, some remain unresolved.” “OSHA does not validate the employee or injury and illness counts reported by establishments.”
“Concluding that establishments are the “most dangerous” or the “least dangerous” solely based on whether they have the highest or lowest rates from these data would be inappropriate.” (Occupational Safety and Health Administration, n.d.).
The variable classifications were derived from the ITA Summary Data Dictionary. From the original OSHA ITA data, spreadsheet “heading row” variables were downloaded into Microsoft Excel spreadsheets. Those key variables include total number of entries, total number of employees, total hours worked, maximum of reported total hours, percent of zero injuries reported, total number of fatalities, total days away from work (DAFW), total days of job transfer for restriction (DJTR) cases, total other cases, total injuries, and a simple calculation of TRIR, DART, and fatality rates.
During data analyses, additional variables and criteria were defined and included to seek to understand errors in the data. These variables include hours worked per full-time equivalency (FTE) employees and TRIR and DART calculations based on individual entries vs. overall determinations.
The BLS SOII reports were manually reviewed, and data was captured to be compared to OSHA ITA data. Variable categories collected from BLS include TRIR, DART, and fatality counts and rates from 2016 to 2022.
2.3 Data Collection
Data was downloaded directly from the OSHA website into MS Excel files. BLS SOII data was entered manually into MS Excel files. Possible confounding variables, such as the COVID-19 year (2020), were analyzed and accounted for. Although errors and discrepancies were identified during the analysis, these were not treated as confounding variables because they were the focus of the study.
2.4 Data Analysis
The ITA CSV file for each year was converted into an MS Excel worksheet. Each year’s data was analyzed in its own MS Excel document, and each step of the analysis was performed in its own tab. The overall TRIR and DART were calculated using the total sum of injury and illness cases with the overall total hours worked. Initial attempts to calculate the hours worked per employee (dividing hours worked by the number of employees for each individual entry) indicated an issue/error due to reporting either zero hours worked or zero employees. This also prevents the calculation of individual TRIR and DART averages and medians. Table 1 indicates anomalies observed during this initial assessment. Thus, all the datasets were re-analyzed to make corrections to these errors.
A visual review of Table 1 reveals an obvious error in the 2019 column. The total hours worked in 2019 were 1,000 times greater than the previous year, and the overall TRIR and DART for 2019 were 0.02 and 0.01, respectively. It is unreasonable to think that this is achievable for an industry average when compared to 2016, 2017, 2018, and 2020. In addition, the OSHA ITA database shows over a trillion hours worked per employee for a single entry in 2019, an obvious error. Based on this initial assessment, the researchers determined to pursue a deeper analysis of the “hours worked” entries and how they may affect the calculation and presentation of TRIR and DART.
Subsequently, a graph (Fig. 1) of TRIRs and DARTs for this data vs. BLS was created, further demonstrating that the original OSHA ITA dataset(s) may have inconsistencies and errors in its entries. To investigate these concerns, the following corrections were applied to all OSHA ITA spreadsheets:
- Remove entries with either zero employees or zero hours worked.
- Remove unreasonably low entries: <11 employees, <1,000 total hours worked per year, and <100 hours worked per employee per year.
- Remove unreasonably high entries: >3,250 hours worked per employee per year, and >500 TRIR.
Corrective Step 1 – Sorting by “DIV/0!” results under column header “HR/FTE”.
Employers reporting zero employees and/or zero hours worked caused the calculation of individual entry HR/FTE to produce an error, which MS Excel displays as “DIV/0!”, meaning the formula in that cell had a zero in the denominator. If an organization self-enters zero hours worked or zero employees, it should not be uploading data to the system (or OSHA’s ITA should catch the error and seek resolution). Table 2 displays the updated results after the removal of obvious data entry errors consisting of zero employees or zero hours worked.
Corrective Step 2 – Sorting “Number of Employees” column by Low to High, then Sorting “Hours Worked” by Low to High, and Sorting “HRS/FTE” by Low to High.
The next step consisted of three sorting tasks and the removal of unrealistically “low categories.” It began with removing entries reporting fewer than 11 employees. Next, it was determined that 1,000 total hours per year, or an equivalent of 100 hours per employee, was the absolute minimum that could be accepted; this threshold should still include most (if not all) consulting or emergency response projects that take over two weeks to complete.
Corrective Step 3 – Sorting “HRS/FTE” column by High to Low, then sorting “TRIR” by High to Low.
The third (and final) correction consisted of two sorting tasks and the removal of unrealistically “high categories.” The first task sorted “Hours Worked per Employee (HRS/FTE)” from high to low and removed any entry with more than 3,250 hours. This limit was based on 62.5 hours per week for an entire year (62.5 × 52 = 3,250). The researchers believed this to be reasonable considering some projects or industries require over 150% of the typical 40-hour work week. The final task sorted TRIR from high to low; the researchers chose to be very conservative and accept a TRIR of 500 or less (i.e., up to 500 recordable cases per 100 FTE).
3. Results
The study objectives (and hypotheses) focused on the consistency and reasonability of OSHA’s ITA annual datasets, both on its own and as a comparison to BLS reports. Significant and noteworthy discrepancies were noted in the original OSHA ITA downloaded data (Table 1, Methods Section), especially in 2019 with the summed total hours worked and calculated overall TRIR and DART. Additionally, it was impossible to calculate individual entry hours worked per employee, TRIR, and DART due to “DIV/0!” errors in MS Excel. This investigation determined that some employers upload zero hours worked for the year and/or zero employees for the year, which led to the first corrective step of eliminating those entries. These discrepancies continued to be observed in the updated descriptive statistics table (Table 2, Methods Section). Two additional corrective steps were applied to render a final table of descriptive statistics (Table 4, Methods Section) that displayed some consistency from year to year with less variance over time.
3.1 How Did the Corrections Affect the Overall Datasets and Descriptive Statistics?
Table 5 displays the number of entries removed in each of the three corrective/reductive steps that were performed on each year’s OSHA ITA data download. Table 6 displays the percentage of entries removed in each step.
Based on Tables 5 and 6, the largest correction occurred after removing entries that reported fewer than 11 employees, fewer than 1,000 total hours worked in a year, and fewer than 100 hours worked per employee. On average, this corrective step accounted for 11.4% of entries removed from the analysis. Overall, an average of 12.9% of entries were removed from the originally downloaded OSHA ITA data.
Tables 7 and 8 display the final corrected variables. The average number of employees removed per year was 580,000,000, and the average total hours worked removed was 2,500,000,000,000 (2019 alone accounted for 17,000,000,000,000 total hours removed). The average total hours removed per year, excluding 2019, was 39,700,000,000 (almost 40 billion hours worked per year). An average of 27,930 injury cases and 4,220 illness cases were removed annually, along with 407,803 days away from work and 418,769 days in restriction or job transfer.
Based on Tables 7 and 8, the largest correction occurred to the total number of employees reported for each year (an average 76.5% reduction), followed by the total reported number of hours worked (an average 46.7% reduction). Notably, reports of “zero injuries” were reduced by an average of 17.6% (from an originally reported average of 35.8% to 29.5%). These results indicate that erroneous reporting of the number of employees and hours worked is the most egregious concern, since it represents most of the error reduction during the corrective removal of entries. The self-reported injury and illness cases, by contrast, were reduced by an average of only 2.4%. This finding provides an opposing argument to the “under-reporting” of injury cases addressed in the introduction. Focused attention on proper reporting of the number of employees and actual hours worked has a potentially greater impact on the calculation of injury and illness incidence rates and, therefore, on the validity of the representation of safety program performance.
3.2 How Did the Corrections Affect Hours Worked and Incidence Rate Calculations?
The calculated “hours worked per employee” was determined to be the most drastically skewed (or erroneous) descriptive statistic, most notably the discrepancy in 2019. After the third corrective step, the annual hours worked per employee averaged between roughly 1,600 and 1,800, which is within expectation for full-time workers.
Figure 2 displays the effect of over-reporting hours worked per employee for 2019 and how that error was corrected through the three steps described in the Methods Section. Figure 3 demonstrates the consistency (and reliability) of hours worked per employee from the second to the third (and final) step. Since the incidence rate calculations rely heavily on reliable hours worked, this demonstrates the serious inconsistencies and errors in the original OSHA ITA data. Through three corrective steps, the researchers were able to achieve reliability in hours worked per employee and, therefore, presumed validity of incidence rates.
Table 10 illustrates in greater detail the calculation and presentation of hours worked per employee, TRIR, and DART after the corrective steps. The difference between “average” and “median” for all three variables may indicate a need for additional correction or the possibility that these population statistics do not meet the assumption of normality. Even though more consistency is seen from year to year, Table 10 demonstrates an effect in 2020 (the year of COVID-19) in which hours worked per employee decreased and the percentage of zero TRIR and DART entries increased. Additionally, in 2020, TRIR and DART averages and standard deviations increased while medians decreased. From a calculation perspective, increasing the number of zero entries in a sample will inevitably decrease the median. On the other hand, a decrease in hours worked per employee will increase the TRIR and DART results. Table 10 confirms the effect of “reporting no injuries” and “over-reporting of hours worked” on the reliability of incidence rate calculations.
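The median effect of zero entries is easy to demonstrate with invented values:

```python
from statistics import median

# Hypothetical per-entry TRIR values for five employers.
rates = [1.0, 2.0, 3.0, 4.0, 5.0]
m1 = median(rates)  # median is 3.0

# Appending three zero-injury entries (TRIR = 0) drags the median down
# even though no non-zero rate changed.
rates_with_zeros = rates + [0.0, 0.0, 0.0]
m2 = median(rates_with_zeros)  # median drops to 1.5
```

This is why a rise in “zero injury” submissions in 2020 can depress the median rate while the average and standard deviation move the other way.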
Table 11 provides a comparison of TRIR and DART calculations (overall vs. by individual entry) and the magnitude of change after each corrective step. This provides the opportunity to visually assess how the TRIR and DART rates were calculated and to compare them against BLS SOII data for 2016-2022.
Figure 4 confirms that TRIR and DART calculations after the three corrective steps are the best representation of the data because of their consistency over time.
Figure 5 demonstrates that calculating incidence rates using overall total reported injury and illness cases and total hours worked falls between rates calculated by average and median (and standard deviation) from each individual entry (or employer submission). Note that Fig. 5 used the dataset after all three corrective steps. The general trend from year to year appears to be similar between the average calculation by individual entry and the overall calculation, whereas the median calculation by individual entry trends differently. It’s also interesting to note that both TRIR and DART rate calculations tend to vary in similar fashion.
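The gap between the overall (hours-weighted) calculation and the unweighted per-entry average can be seen in a toy example with invented numbers:

```python
# Two hypothetical employers: a small one with a high rate and a large
# one with a low rate. Rates are cases per 200,000 hours worked.
entries = [
    {"cases": 2, "hours": 20_000},     # per-entry TRIR = 20.0
    {"cases": 5, "hours": 2_000_000},  # per-entry TRIR = 0.5
]

per_entry = [e["cases"] * 200_000 / e["hours"] for e in entries]
average_rate = sum(per_entry) / len(per_entry)  # unweighted mean: 10.25
overall_rate = (sum(e["cases"] for e in entries) * 200_000
                / sum(e["hours"] for e in entries))  # ~0.693

# The overall rate is dominated by the large employer's hours, while the
# unweighted average is pulled up by the small employer's high rate.
```

This weighting difference is one reason the three calculation methods in the figures diverge in both magnitude and trend.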
In Fig. 6, the corrected TRIR median (calculated by individual entry) and corrected DART median (calculated by individual entry) are the only OSHA ITA rates that are “close” to the BLS comparison rates, but they are not perfectly representative. After 2019, the OSHA ITA median DART is lower than BLS, and prior to 2020, the OSHA ITA median TRIR was higher than BLS.
In Fig. 6, the TRIRs calculated both overall and by individual entries increase over time and are substantially greater in magnitude than the BLS TRIR. Similarly, the DART calculations, both overall and by individual entries, are increasing over time and are substantially greater in magnitude than the BLS DART. Additionally, after 2019, these DART calculations are greater than the BLS TRIR.
4. Discussion and Conclusion
The objectives of this research study were to investigate:
- If the OSHA ITA annual data contains minimal self-reported errors.
- If the OSHA ITA annual data is comparable to BLS SOII data.
Based on the ad hoc discovery, corrective actions, and visual determinations using results tables and figures, the OSHA ITA annual data lacks validity and representation due to errors in entries and inconsistencies over time, as well as very limited comparability with BLS annual TRIR and DART data. Although not justified by statistical significance testing, comparative analysis using tables and figures over time clearly demonstrated these determinations. Possibly more interesting is the discovery of erroneous reporting of hours worked and, in the case of OSHA ITA, of the number of employees. Had the hours worked per employee variable not been calculated on an individual entry basis, the severity and complexity of OSHA ITA data errors might not have been so vividly clear. Most “data quality” research literature focuses entirely on the impact of under-reporting of injury and illness cases as a cause of invalid or unreliable incidence rates. This study found a greater impact from both under- and over-reporting of hours worked and/or number of employees, which directly affects the validity and reliability of incidence rate calculations.
Poor quality data entry will lead to unreliable data output. The information gathered by OSHA needs to be highly accurate and analyzed for that accuracy. Datasets within the ITA database must be checked for validity, completeness, and accuracy (Kilkenny and Robinson, 2018). New approaches to safety measurements are needed. In addition to its statistical invalidity, TRIR does not describe why the performance (good or bad) was achieved or what can be done to improve. The results and findings of this study should leave organizations wondering whether they should continue using TRIR and DART as performance measures for their safety programs. Meanwhile, the academic and professional community should consider alternative measures of safety performance that assess the actual safety system at a high frequency. Increasing the number of reliable measurements could drastically improve the stability, precision, and predictive nature of safety metrics. To be comparative, however, these metrics must be standardized and consistently reported (Oguz Erkal et al., 2023). In nearly every practical circumstance, it is statistically invalid to use TRIR to compare companies, business units, projects, or teams (Hallowell et al., 2021).
OSHA recordkeeping has influenced safety programs for decades. This research study seriously questions whether incidence rates are the best way to measure a safety program. The study demonstrates that three different TRIR and DART calculations are possible from the same data and that, over time, they do not behave in similar ways. Are organizations under-reporting and hiding injuries to lower TRIR? Or are they over-reporting hours worked to effectively lower their TRIR and DART rates? There is a definite lack of accuracy and reliability within the OSHA ITA records, and additional questions remain about the quality of BLS data. The errors found during this analysis point to inaccuracies within the OSHA ITA database and its representation of TRIR and DART reporting. Comparing organizations against one another has become a staple, yet this measurement system creates confusion, misrepresentation, and an overall injustice to organizations, and it does not improve safety performance. Safety performance should instead be measured by leading indicators, employee involvement, and the reported effectiveness of corrective actions.
As demonstrated by this study, OSHA ITA and its incidence rate calculations are a flawed and inaccurate measurement system. Future research needs to focus on quantitative analyses of dataset normality, the effects of inaccurate/skewed data based on reporting company size, and possible differences between state-run and federal OSHA agencies.
References
Berg, B.L. and Lune, H., Qualitative Research Methods for Social Sciences, 8th Ed. Pearson Publishing, England: Harlow, 2012.
Brent, E. and Leedy, P.D., Planning Your Research Project. Practical Research: Planning and Design, Teaching Sociology, vol. 18(2), no. 248. pp. 95-121, 1990.
Fagan, K.M. and Hodgson, M.J., Under-Recording of Work-Related Injuries and Illnesses: An OSHA Priority, J. of Safety Research, vol. 60, pp. 79-83, 2017.
Kilkenny, M.F. and Robinson, K.M., Data Quality: “Garbage In – Garbage Out,” Health Info. Manag.: J. of the Health Info. Manag. Assoc. of Australia, vol. 47(3), pp. 103–105, 2018.
Hallowell, M., Quashne, M., Salas, R., Jones, M., MacLean, B., and Quinn, E., The Statistical Invalidity of TRIR as a Measure of Safety Performance, Professional Safety, vol. 66(4), pp. 28-34, 2021.
Lander, L., Eisen, E.A., Stentz, T.L., Spanjer, K.J., Wendland, B.E., and Perry, M.J., The Near Miss Reporting System as an Occupational Injury Preventive Intervention in Manufacturing, American J. of Industrial Medicine, vol. 54(1), pp. 40–48, 2011.
Manjourides, J. and Dennerlein, J.T., Testing the Associations between Leading and Lagging Indicators in a Contractor Safety Pre-Qualification Database, American J. of Industrial Medicine, vol. 62(4), pp. 317–324, 2019.
Murphy, P.L. et al., Injury and Illness in the American Workplace: A Comparison of Data Sources, American J. of Industrial Medicine, vol. 30, pp. 130-141, 1996.
Oguz Erkal, E.D. and Hallowell, M.R., Moving Beyond TRIR: Measuring and Monitoring Safety Performance with High-Energy Control Assessments, Professional Safety, vol. 68(5), pp. 26-35, 2023.
Oguz Erkal, E.D., Hallowell, M.R., and Bhandari, S., Formal Evaluation of Construction Safety Performance Metrics and a Case for a Balanced Approach, J. of Safety Research, vol. 85, pp. 380–390, 2023.
OSHA a., Recordkeeping—Detailed Guidance for OSHA’s Injury and Illness Recordkeeping Rule, Occupational Safety and Health Administration, accessed June 9, 2024, from https://www.osha.gov/recordkeeping/entry-faq.
OSHA b., OSHA Injury Tracking Application – Login page, Occupational Safety and Health Administration, accessed June 9, 2024, from https://www.osha.gov/injuryreporting/ita/.
OSHA c., 1904.1—Partial Exemption for Employers with 10 or Fewer Employees, Occupational Safety and Health Administration, accessed October 23, 2021, from www.osha.gov/laws-regs/regulations/standardnumber/1904/1904.1.
OSHA d., 1904.2—Partial Exemption for Establishments in Certain Industries, Occupational Safety and Health Administration, accessed October 23, 2021, from https://www.osha.gov/laws-regs/regulations/standardnumber/1904/1904.2.
Rosenman, K.D., Kalush, A., Reilly, M.J., Gardiner, J.C., Reeves, M., and Luo, Z., How Much Work-Related Injury and Illness is Missed by the Current National Surveillance System?, J. of Occupational and Environmental Medicine, vol. 48(4), pp. 357-365, 2006. DOI: 10.1097/01.jom.0000205864.81970.63
U.S. Bureau of Labor Statistics, How to Compute a Firm’s Incidence Rate for Safety Management, 2019.
Wuellner, S.E. and Bonauto, D.K., Exploring the Relationship Between Employer Recordkeeping and Underreporting in the BLS Survey of Occupational Injuries and Illnesses, American J. of Industrial Medicine, vol. 57(10), pp. 1133–1143, 2014.