
Readability Metrics of Provider Postoperative Handouts in Urology

Published: September 02, 2020 · DOI: https://doi.org/10.1016/j.urology.2020.08.044

      Objective

      To examine the readability of postoperative urology handouts and assess for areas of improvement. We hypothesize that the majority of provider handouts exceed the National Institutes of Health recommendation of writing at a sixth-grade reading level.

      Methods

      We reviewed 238 postoperative patient handouts in the public domain representing United States academic and private practices. All handouts were categorized and reformatted as plain text in Microsoft Word. A median reading grade level was calculated with the Readability.io web application using five formulas: the Flesch-Kincaid Grade Level, Gunning Fog Index, Coleman-Liau Index, Simple Measure of Gobbledygook (SMOG), and Automated Readability Index. Word count was also assessed.
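The study relied on the Readability.io web application to score each handout. As an illustration of what such tools compute, the sketch below implements the published Flesch-Kincaid Grade Level formula (0.39 × words/sentence + 11.8 × syllables/word − 15.59). The syllable counter is a rough vowel-group heuristic for demonstration only; production tools use pronunciation dictionaries and more careful tokenization, so scores may differ slightly from Readability.io's output.

```python
import re

def count_syllables(word: str) -> int:
    """Approximate syllables by counting vowel groups (heuristic only)."""
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    # Drop a typically-silent trailing 'e' (e.g., "office"), but keep
    # it for words ending in "-le" or "-ee" (e.g., "simple", "degree").
    if word.endswith("e") and n > 1 and not word.endswith(("le", "ee")):
        n -= 1
    return max(n, 1)

def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid Grade Level (Kincaid et al., 1975):
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)
```

A sentence full of polysyllabic medical jargon ("Postoperative complications necessitate immediate communication...") scores many grade levels above plain phrasing of the same instruction, which is exactly the gap the sixth-grade target is meant to close.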

      Results

      Provider handouts were written at a median 9.3 grade reading level (range 5.8-14, IQR 8.45-10). A total of 15 (6.8%) handouts were written at a sixth-grade reading level, and only 1 (0.4%) was written below that target. Six (2.7%) handouts were written at a college reading level. There were no significant differences between subspecialties. Median word count was 509 (range 90-3796, IQR 361-738). Although a high word count may make it harder for patients to follow instructions, readability did not correlate with word count.
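The summary statistics reported above (median and interquartile range of grade levels) can be reproduced with the Python standard library. The scores below are illustrative placeholders, not the study's data.

```python
import statistics

# Hypothetical grade-level scores for a handful of handouts (for illustration).
grades = [5.8, 8.5, 9.0, 9.3, 9.8, 10.0, 14.0]

median = statistics.median(grades)
# statistics.quantiles with n=4 returns the three quartiles (Q1, Q2, Q3);
# the IQR is the span from Q1 to Q3.
q1, q2, q3 = statistics.quantiles(grades, n=4)
print(f"median={median}, IQR={q1}-{q3}")
```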

      Conclusions

      Our data show that over 93% of the analyzed handouts failed to meet National Institutes of Health recommendations for reading grade level. Longer word counts did not correlate with higher reading levels. Future work should assess patient satisfaction with handouts and correlate the complexity of postoperative handouts with outcomes such as unplanned phone calls and unscheduled visits.
