| Title | Practices of school-based speech-language pathologists assessing language impairment in English learning children |
| Publication Type | thesis |
| School or College | College of Health |
| Department | Communication Sciences & Disorders |
| Author | Ketchoyian, Alexandra Rose |
| Date | 2016 |
| Description | The purpose of the present study was two-fold. The first aim was to collect preliminary data regarding whether Speech-Language Pathologists (SLPs) in northern Utah adhere to federal, state, and professional mandates when conducting initial assessments of English learning (EL) children. The second aim was to gather information via survey about SLPs' assessment practices, training, and their confidence in assessing children, regardless of language status, suspected of language impairment. The first aim was addressed by conducting a systematic review of speech-language assessment reports, along with any supporting documentation, in two districts and one charter school in northern Utah. Results of the file review clearly demonstrated that SLPs did not adhere to federal, state, and professional mandates regarding appropriate assessment of English learning children. This "nonadherence" consisted of SLPs' over-reliance on standardized measures, a lack of developmental history information from parents, and the absence of informal measures. Survey results revealed that SLPs tended to use a one-size-fits-all approach to language assessment regardless of whether the child was EL or monolingual English speaking. In terms of confidence in assessing EL children, all SLPs, whether bilingual or monolingual English speakers, reported feeling more confident conducting assessments of monolingual English-speaking children. Although the SLPs surveyed reported having knowledge of the more commonly promoted informal measures, evidence of implementation was minimal. As a consequence, English learning children assessed for language impairment seemed to have been assessed using measures that do not align with recommended practices set forth by legal and professional organizations. |
| Type | Text |
| Publisher | University of Utah |
| Subject | Assessment procedures; English learning; Language impairment |
| Dissertation Name | Master of Science in Speech-Language Pathology |
| Language | eng |
| Rights Management | ©Alexandra Rose Ketchoyian |
| Format | application/pdf |
| Format Medium | application/pdf |
| Format Extent | 293,091 bytes |
| Identifier | etd3/id/4230 |
| ARK | ark:/87278/s6v15d48 |
| DOI | https://doi.org/10.26053/0H-654K-51G0 |
| Setname | ir_etd |
| ID | 197775 |
| OCR Text | PRACTICES OF SCHOOL-BASED SPEECH-LANGUAGE PATHOLOGISTS ASSESSING LANGUAGE IMPAIRMENT IN ENGLISH LEARNING CHILDREN

by Alexandra Rose Ketchoyian

A thesis submitted to the faculty of The University of Utah in partial fulfillment of the requirements for the degree of Master of Science, Department of Communication Sciences and Disorders, The University of Utah, August 2016. Copyright © Alexandra Rose Ketchoyian 2016. All Rights Reserved.

The University of Utah Graduate School
STATEMENT OF THESIS APPROVAL
The thesis of Alexandra Rose Ketchoyian has been approved by the following supervisory committee members: Robert Kraemer, Chair (Date Approved 05/09/2016); Sean Redmond, Member (Date Approved 05/09/2016); Pamela Mathy, Member (Date Approved 05/09/2016); and by Michael Blomgren, Chair/Dean of the Department/College/School of Communication Sciences and Disorders; and by David B. Kieda, Dean of The Graduate School.

ABSTRACT
The purpose of the present study was two-fold. The first aim was to collect preliminary data regarding whether Speech-Language Pathologists (SLPs) in northern Utah adhere to federal, state, and professional mandates when conducting initial assessments of English learning (EL) children. The second aim was to gather information via survey about SLPs' assessment practices, training, and their confidence in assessing children, regardless of language status, suspected of language impairment. The first aim was addressed by conducting a systematic review of speech-language assessment reports, along with any supporting documentation, in two districts and one charter school in northern Utah. Results of the file review clearly demonstrated that SLPs did not adhere to federal, state, and professional mandates regarding appropriate assessment of English learning children. This "nonadherence" consisted of SLPs' over-reliance on standardized measures, a lack of developmental history information from parents, and the absence of informal measures.
Survey results revealed that SLPs tended to use a one-size-fits-all approach to language assessment regardless of whether the child was EL or monolingual English speaking. In terms of confidence in assessing EL children, all SLPs, whether bilingual or monolingual English speakers, reported feeling more confident conducting assessments of monolingual English-speaking children. Although the SLPs surveyed reported having knowledge of the more commonly promoted informal measures, evidence of implementation was minimal. As a consequence, English learning children assessed for language impairment seemed to have been assessed using measures that do not align with recommended practices set forth by legal and professional organizations.

TABLE OF CONTENTS
ABSTRACT ... iii
LIST OF TABLES ... vii
INTRODUCTION ... 1
REVIEW OF THE LITERATURE ... 6
Formal assessments ... 6
Clinical training ... 10
Informal measures ... 11
Use of interpreters ... 18
Factors influencing SLP assessment practices ... 20
Predictions ... 22
METHODS ... 24
File review ... 24
SLP survey ... 25
RESULTS ... 27
Question 1 ... 28
Question 2 ... 32
Question 3 ... 35
Question 4 ... 39
Question 5 ... 41
DISCUSSION ... 45
Low levels of compliance with inclusion of informal assessment procedures and parent information ... 47
Lack of variation in perceived level of qualification and participation in graduate coursework/clinical training ... 49
Caseload size and district mandates ... 49
Overall confidence levels ... 50
STUDY LIMITATIONS ... 51
CONCLUSION ... 52
APPENDICES
A: QUESTIONS GUIDING FILE REVIEW ... 54
B: SLP SURVEY QUESTIONS ... 56
REFERENCES ... 62

LIST OF TABLES
1 Results from review of speech-language assessment reports ... 30
2 Commonly used tests extracted from file review ... 33
3 Survey responses for participation in education about assessment of EL children ... 36
4 Survey responses for inclusion of NWR tasks in the assessment process ... 36
5 Survey responses for inclusion of information about typical developmental milestones ... 37
6 Survey responses for inclusion of information about presence of family history ... 37
7 Survey responses for perceived level of qualification assessing EL children ... 38
8 Pressure to use standardized tools ... 40
9 District mandates regarding use of standardized tools ... 40
10 Average caseload size per district ... 42
11 Overall confidence levels in conducting assessments ... 42
12 Monolingual SLPs' confidence levels in conducting assessments ... 43
13 Bilingual SLPs' confidence levels in conducting assessments ... 43
14 SLPs' opinion regarding misidentification of EL children for language ... 43

INTRODUCTION
The United States (U.S.) continues to be one of the most diverse countries in the world in terms of cultural, religious, and linguistic variation. It is estimated that over 350 languages are spoken in the U.S. (Ryan, 2013). Results from a survey completed by the U.S. Census Bureau in 2011 showed that over 60 million people over the age of 5 spoke a language other than English in their home, making up 21% of the total population. The greatest increase in culturally and ethnically diverse individuals is seen in the Latino population. According to the 2014 U.S. Census Bureau population projections, the number of U.S. Latino residents is expected to double during the next 25 years. According to the National Center for Education Statistics (NCES, 2012), the number of Latino students increased from 8.6 million to 12.1 million between 2002 and 2012. As such, 79% of the school-aged English learning (EL) Latino children in the U.S. education system speak Spanish as their home language (Goldberg, 2008; Peña, Gillam, Bedore, & Bohman, 2011; U.S. Department of Education, 2008). Utah is home to increasing numbers of foreign-born individuals. Between 1980 and 2010, the foreign-born population in Utah increased from 3.5% to 8.1% of the state's overall population. In addition, Utah has the eighth fastest growing EL population in the U.S. The Census Bureau estimates that 13% of Utah residents between the ages of 5 and 17 speak a language other than English at home.
Although the majority of EL children attending schools in Utah speak Spanish as their first language (83%), Navajo (3%), Vietnamese (2%), Somali (1%), Tongan (1%), as well as other languages are spoken (Fair, 2012). In consideration of the growing numbers of EL children entering and attending U.S. schools, there exist federal, state, and professional policies and guidelines to ensure equal access to educational opportunities for these children. It is important for educational professionals, including Speech-Language Pathologists (SLPs), to abide by these guidelines, particularly during the assessment process when identifying children for access to special education services. In the field of Speech-Language Pathology, guidelines for the appropriate assessment of EL children suspected of language impairment have been well defined and documented. These guidelines are available on the American Speech-Language-Hearing Association's (ASHA) website (ASHA, 2015; http://www.asha.org/Practice-Portal/Professional-Issues/Bilingual-Service-Delivery/). Whereas ASHA provides SLPs with the technical aspects of appropriate service delivery (e.g., assessment and intervention) to children with speech and language impairments, the overarching policy guiding practices of educational professionals (including SLPs) is the Individuals with Disabilities Education Improvement Act (IDEIA, 2004). IDEIA provides guidelines pertinent to special education professionals (including SLPs) working with EL children regardless of the child's ethnic and/or linguistic background. The Utah State Office of Education's (USOE) policies align with those of IDEIA in providing guidelines to individuals working with EL children.

IDEIA, USOE, and ASHA policy and guidelines
IDEIA (2004) provides recommendations to individuals responsible for assessing children with suspected speech-language impairments regardless of ethnic and/or linguistic status.
IDEIA states that assessments and other evaluation materials be administered "in the form most likely to yield accurate information on what the child knows and can do academically, developmentally, and functionally." Hence, to comply with IDEIA policy, an assessment must include thorough investigation of a child's home language proficiency as well as his/her proficiency in the second language needed for academic success. In the state of Utah, educational professionals are expected to provide "measurable information about Utah students' core knowledge, skills, and abilities; acquired through high quality valid and reliable assessments" (USOE, 2010; http://www.schools.utah.gov/assessment/). This requirement includes SLPs assessing children suspected of speech or language delay. With regard to EL children in the public school system, the state of Utah operates under the same guidelines and regulations as those of the U.S. Department of Education Office for Civil Rights (2000). It is considered a violation of Title VI of the Civil Rights Act of 1964 if children are misidentified as having special education needs due to their lack of English language proficiency. Additionally, educational professionals are required to include parents in all decisions regarding a child's participation in programs or special services by ensuring that important information is relayed in their home language (U.S. Department of Education Office for Civil Rights, 2000). Consequently, to comply with USOE policy, SLPs must use "high quality valid and reliable" (USOE, 2010) assessment tools that will not result in misdiagnosis of EL children as having a language impairment versus a language difference. They also must include parents in the assessment and treatment process, regardless of the parents' home language.
For parents whose primary language is not English, SLPs must use interpreters to give parents the opportunity to make well-informed decisions and provide input about their child. In addition to federal and state policies, SLPs should adhere to ASHA's Key Points of Bilingual Service Delivery (2015), which states that SLPs must employ culturally and linguistically adapted test equivalents in both languages to compare potential deficits (ASHA, 2015). Whereas IDEIA and USOE provide general guidelines for SLPs responsible for assessing EL children suspected of language impairments, ASHA provides detailed guidelines and evidence-based methods. These evidence-based practices consist of the following: the use of interpreters in all aspects of the assessment; inclusion of nonstandardized assessment procedures such as dynamic assessment, language sampling, and narrative assessment; and inclusion of in-depth case history information and parent/caregiver questionnaires. In sum, recommendations from IDEIA, USOE, and ASHA policies and guidelines clearly state that speech-language assessment practices must be conducted in an EL child's home (or dominant) language(s) as well as in English. In 2007, the NCES reported that 11 million elementary and secondary students attending U.S. public schools spoke a language other than English in the home. In that same year, over 2.5 million non-White (i.e., Black, Hispanic, Asian, Pacific Islander, American Indian, and Alaska Native) children were enrolled in special education services (Aud, Fox, & Kewal-Ramani, 2010). Although a specific number was not reported, it can be assumed that a portion of these children were English learners. Of those assessed for speech-language services, little is known about whether they presented with a language impairment or a language difference.
Given the increasing rate of EL children entering the Utah public schools, the identification of language impairments in EL children poses considerable challenges for both monolingual English-speaking and bilingual SLPs. Aside from a possible lack of familiarity with IDEIA, USOE, and ASHA guidelines, additional challenges SLPs face during the assessment process include 1) the lack of norm-referenced standardized bilingual language assessments, 2) the lack of confidence and assessment experience with EL children from a range of language and cultural backgrounds, and 3) the lack of familiarity and experience in using informal, nonstandardized language measures (Caesar & Kohler, 2007; Kraemer et al., 2013; Paradis, Schneider, & Duncan, 2013). In consideration of these potential challenges, it is unknown whether school-based SLPs working in northern Utah conducted nonbiased assessments of EL children suspected of language impairment. The factors influencing the SLPs' chosen assessment practices and their levels of confidence are also unknown.

REVIEW OF THE LITERATURE
Formal assessments
Currently, the eligibility requirements for children with suspected language impairments to receive services in the public schools are dependent upon norm-referenced test results (Kraemer et al., 2013; Roulstone, Peters, Glogowska, & Enderby, 2008). These assessments are designed to compare the child's abilities to a normative sample to identify those functioning below the cutoff score. For Spanish-speaking children, there are several tests that have been translated from their original language, which is helpful in giving the child the opportunity to access their home language to respond to test stimuli, but such tests may not provide a comprehensive representation of the specific language's phonological, morphological, and syntactical characteristics (Gutiérrez-Clellen & Simon-Cereijido, 2009).
These tests also have questionable psychometric validity because the normative sample (monolingual English-speaking children from the U.S., for example) is not representative of the population being tested, i.e., bilingual Spanish/English-speaking children (Langdon & Cheng, 2002). This threat to psychometric validity requires that standard scores be interpreted with extreme caution and be presented only as a qualitative or observational analysis of the child's speech or language level (Langdon & Cheng, 2002). In addition, when determining whether a standardized measure will be helpful for diagnosis of language impairment, it is important to consider the specific linguistic subdomains that are achieved during acquisition of a second language. Paradis, Schneider, and Sorensen Duncan (2013) investigated the performance of typically developing (TD) EL children and EL children with language impairment on monolingual English tests examining vocabulary, morphology, story grammar, and nonword repetition. Results revealed no difference between TD and language-impaired children's scores on the vocabulary subdomain (determined by results on the Peabody Picture Vocabulary Test-Third Edition). However, differences were observed on the tests examining the other linguistic subdomains. These results indicate that use of vocabulary tests (even in the child's second language) is not helpful for differentially diagnosing EL children as TD or language-impaired. Currently, the Bilingual English Spanish Assessment (BESA; Peña, Gutiérrez-Clellen, Iglesias, Goldstein, & Bedore, 2014) is the only psychometrically sound standardized test for assessment of bilingual Spanish/English-speaking children. Reliability values in terms of internal consistency have alpha coefficients for ages 4-6 above .8, with the exception of English semantics for ages 5 and 6, which are still considered acceptable at .8 and .75.
Interrater reliability is also high at 95% or above for all subtests in both languages. The English morphosyntax and semantics subtests have moderate to significant correlations with language sample and standardized test measures (values ranging from .345 to .818), demonstrating content and construct validity. The Spanish morphosyntax and semantics subtests correlate with language sample measures and Expressive One Word Picture Vocabulary Test-Bilingual raw scores (ranging from .284 to .406). The BESA incorporates test stimuli in both Spanish and English and allows for code switching between both languages. The test provides composite scores representative of a child's strengths and weaknesses in both languages. Most importantly, the BESA was developed and normed using bilingual Spanish-English speaking children living in the U.S. of varying cultural backgrounds. Although the BESA is a step in the right direction in terms of providing an appropriate norm-referenced assessment for Spanish-English EL children, there are two minor limitations: 1) the limited age range (4 years through 6 years and 11 months, with the norms only being reliable for age 5) omits a large portion of school-aged children who are often misidentified due to the lack of valid assessment tools, and 2) the continued need for trained Spanish-English speaking individuals to administer the Spanish portion of the test. Whether administering a test translated from English into a child's home language or a standardized tool such as the BESA, it is recommended that SLPs be proficient in the child's home language. In addition, the individual must also demonstrate an understanding of the test jargon related to the assessment tools. It is also vital that those assessing EL children possess knowledge of both typical and atypical second language acquisition. Ninety-five percent of SLPs are monolingual English speakers (ASHA, 2014; www.asha.org/uploadedFiles/Demographic-Profile-Bilingual-Spanish-Service-Members.pdf).
The vast majority of professionals would therefore require recruitment and use of a highly trained interpreter, which may be a challenge given the minimal allotted resources and the time required to provide training. Consequently, many professionals resort to using English-only standardized measures when assessing EL children on their caseloads (Caesar & Kohler, 2007). A survey by Caesar and Kohler (2007) investigated the tests most commonly administered by SLPs (N = 130) to EL children suspected of language impairment. Ninety-eight percent of the respondents reported administering English-only measures as the only measure of language impairment. This practice is particularly problematic not only because of the threat to psychometric validity, but because EL children are not given the opportunity to fully demonstrate their bilingual skills and, as such, are likely to be penalized for having low English proficiency. Thus, EL children assessed via English-only tests tend to score lower than their monolingual peers (Bialystok, Craik, & Luk, 2008) and consequently may be misidentified as language impaired. This type of assessment practice has continued to lead to the misidentification of language impairments in school-aged EL children (Paradis, 2005). Such misdiagnoses lead to unnecessary provision of treatment services, contribute to large caseloads for school-based SLPs, and result in unwarranted use of healthcare resources. As previously demonstrated by Artiles, Rueda, Salazar, and Higareda (2005), a disproportionately high number of Latino children were represented in special education and associated services compared to their non-Latino peers. As awareness of this problem has increased, the trend has shifted toward an under-diagnosis of EL children, fueled by the explanation that observed language deficits were attributed to difference, not disorder (Artiles et al., 2005).
A recent, highly statistically controlled study by Morgan et al. (2015) revealed Latino children to be significantly underrepresented in speech and language services in schools across the U.S., leading to decreased access to special education services for these children. Thus, both clinical and teacher training programs must focus on remedying the problem of misdiagnosis by developing and implementing both formal and informal assessment procedures.

Clinical training
Hammer, Detwiler, Detwiler, Blood, and Qualls (2004) surveyed SLPs in various geographical regions of the U.S. and reported that the majority of the 213 participants did not feel confident in evaluating children of Latino heritage whose home language was Spanish. It was also found that about one-third of the respondents in this study did not feel that they had received adequate training in the assessment and treatment of multicultural students. Results from this study were consistent with data collected in a study conducted over 2 decades ago by Campbell and Taylor (1992), which reported that 83% of their participants did not feel competent in assessing the EL children on their caseloads. A similar study by Caesar and Kohler (2007) investigated the level of preparedness SLPs felt when entering the profession. These researchers demonstrated that only 28% of the 130 survey respondents reported that they had been provided with adequate training and knowledge for performing language assessments of EL children. The discrepancy between the increasing population of EL children in the U.S. and the relatively stagnant confidence levels reported by the SLPs assessing these children is a cause for concern. In addition to their survey, Caesar and Kohler (2007) examined the specific assessment tools used by SLPs when assessing EL children suspected of language impairments.
They discovered that SLPs relied on formal, standardized assessment tools to determine eligibility for speech and language services. The most frequently used form of informal assessment was language sampling (33% of participants), and informal observation was used by only 10% of the respondents. There was no mention of the use of other informal procedures such as dynamic assessment, nonword repetition, or narrative analysis.

Informal measures
There are several informal measures that can be used in conjunction with or in place of standardized assessments. The use of informal measures with EL children can help SLPs gain a fuller understanding of the language skills of this complex and heterogeneous group (Friberg, 2010). Research has been conducted regarding the diagnostic accuracy of various informal measures by comparing their results to results derived from evidence-based standardized procedures. These informal measures include 1) language sampling, 2) nonword repetition (NWR), 3) dynamic assessment (DA), 4) narrative assessment, and 5) in-depth parent questionnaires. In addition to these methods, well-trained interpreters are recommended when the SLP does not speak the child's home language.

Language sampling
As demonstrated by the results of the survey administered by Caesar and Kohler (2007), language sampling is the most common form of informal or criterion-based assessment used by SLPs with the EL children on their caseloads. This is a critical component of the assessment process because it exposes the communication forms, functions, and structures used by the child in a more naturalistic environment than standardized testing (Paul & Norbury, 2012). It is important that SLPs collect samples in a child's home language as well as in English. Of the 33% of SLPs using language sampling in their assessment procedures in the aforementioned study, 68% collected samples exclusively in English (Caesar & Kohler, 2007).
The diagnostic accuracy of this procedure would be greatly improved with the inclusion of an interpreter to help collect a language sample in the child's home language. Gutiérrez-Clellen and Simon-Cereijido (2009) created a protocol for obtaining spontaneous language samples from both English- and Spanish-speaking children. These authors recommended eliciting narratives from children in both English and Spanish by using wordless picture books. The Systematic Analysis of Language Transcripts (SALT; Miller & Iglesias, 2012) software is available in both languages and can be used by SLPs (or interpreters) to accurately analyze language production in both languages. In terms of linguistic development, research on monolingual English-speaking children has shown that particular morphological markers exist that are predictive of language impairments. These are regular past tense -ed, present third person singular -s, as well as copula and auxiliary forms of "is," "are," and "am." Similarly, Spanish presents with particular verb morphology that is typically problematic for children with language impairments. Gutiérrez-Clellen and Simon-Cereijido (2009) developed guidelines presenting typical language markers SLPs can use to assist in their analysis of Spanish samples. These markers include clitic pronouns (e.g., "La niña le dio el conejo a él" [The girl gave the bunny to him]), verbs (e.g., "Quiero que el niño haga su tarea" [I want the boy to do his homework]), and articles (e.g., "El perro come un hueso" [The dog eats a bone]). Additionally, the authors provided a rubric for assessing narrative productions in both languages based on "One Frog Too Many" (Mayer, 1975) and "Frog on His Own" (Mayer, 1973), with information on how to score correct/incorrect productions. By implementing this practice, SLPs can become more knowledgeable about typical versus atypical production of microstructure elements in Spanish story retell.
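Language-sample analysis of the kind described above is typically automated by software such as SALT. As a minimal sketch only (the utterances and the word-level counting are invented for illustration and do not reproduce SALT's transcription conventions, which count morphemes), one common summary measure, mean length of utterance, could be computed like this:

```python
# Illustrative sketch: mean length of utterance (MLU) in words from a toy
# transcript. Real clinical analysis uses SALT's conventions and
# morpheme-level counts; these utterances are invented examples.

def mlu_in_words(utterances):
    """Average number of words per utterance, ignoring empty lines."""
    counts = [len(u.split()) for u in utterances if u.strip()]
    return sum(counts) / len(counts) if counts else 0.0

sample = [
    "el perro come un hueso",
    "la niña le dio el conejo",
    "quiero agua",
]
print(round(mlu_in_words(sample), 2))  # (5 + 6 + 2) / 3 -> 4.33
```

A word-based count like this understates the grammatical detail (clitics, verb morphology, articles) that the Gutiérrez-Clellen and Simon-Cereijido guidelines target; it is shown only to make the notion of a language-sample summary measure concrete.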
In addition, SLPs may become more adept at collaborating with interpreters to conduct assessments, which results in appropriate eligibility determinations for EL children.

Nonword repetition
NWR examines a child's fast mapping and word learning abilities by measuring repetition of nonsense words (Kan & Windsor, 2010). This skill has been shown to be an effective measure of children's phonological short-term memory. When used during an assessment, NWR, in a sense, controls for previous language experience (Kan & Windsor, 2010). NWR tasks have been reported to be a valid method for differentiating typical language skills from atypical skills. For example, results from a meta-analysis conducted by Kan and Windsor (2010, as cited by Kraemer et al., 2013) revealed that children with language impairment struggle significantly on NWR tasks when compared to their typically developing age-matched peers. Based on our knowledge of language development in EL children, if language impairment is present in the home language, it will also manifest in other languages being learned. Language impairment is an overall deficit in the language learning system unrelated to the process of acquiring more than one language (Langdon & Cheng, 2002). With that said, the aforementioned study, though not representative of EL children, is a good indication of how children with language impairment who speak more than one language may perform on a NWR task. Additionally, language experience biases are minimized in NWR assessment, which will not penalize EL children for low vocabulary knowledge. Girbau and Schwartz (2008) examined phonological working memory in sequential English learning Spanish-English bilingual children using Spanish phonotactic NWR tasks. In this study, 11 bilingual participants who had been previously diagnosed with language impairments were compared to 11 age-matched typical Spanish-English bilingual children.
Twenty nonsense words were administered to the participants, and the number of correct productions was compared to their scores on standardized language assessments in both English and Spanish. Results showed significantly fewer correct productions from the language-impaired group. Perhaps the most interesting finding was the presence of significant correlations between NWR task performance and language assessment scores within each group. These results suggest that Spanish NWR tasks are an accurate diagnostic indicator but often require a bilingual SLP to administer.

Dynamic assessment

DA is a model based on Vygotsky's concept of the zone of proximal development (ZPD; as cited in Kapantzoglou, Restrepo, & Thompson, 2012, p. 2). The ZPD is not the child's current level of knowledge but rather the level of knowledge the child has the potential to achieve with adult support. The intention of this type of assessment is to minimize the influence of previous language experiences and measure the child's use of the metacognitive processes that allow him/her to learn new concepts (Kapantzoglou, Restrepo, & Thompson, 2012). The test-teach-retest paradigm in particular has been researched and shown to be a good way of assessing an EL child's ability to learn, and it can effectively shed light on whether a child has a language disorder or a language difference (Gutiérrez-Clellen & Peña, 2001). Kapantzoglou, Restrepo, and Thompson (2012) investigated the sensitivity and specificity of DA in distinguishing typical development from primary language impairment in Spanish-English bilingual preschoolers. The authors used a test-teach-retest model to measure the children's ability to learn new words. The children were taught three nonsense words by relating them to three corresponding unfamiliar objects. Word/object associations were taught through discussion of semantic relationships and functionality.
The participants' ability to identify and produce the words after a predetermined number of exposures was tested and recorded. An important component of DA, as noted by Kapantzoglou, Restrepo, and Thompson (2012), is a Modifiability Score, which can be recorded using either the Learning Strategies Checklist (LSC; Lidz, 1991) or the Modifiability Scale (MS; Lidz, 1987, 1991, as cited by Kapantzoglou, Restrepo, & Thompson, 2012). The LSC and the MS are two tools SLPs can use to measure the child's level of attention and self-regulation during the learning process as well as the level of adult support required. The authors found that combining nine exposures to the novel word with a measurement of the child's ability to identify the word and the LSC resulted in 80% identification accuracy.

Narrative assessment

A third informal measure used to assess language involves assessing a child's production of story macrostructure (story grammar elements). Pearson (2002, as cited by Squires, Lugo-Neris, Peña, Bedore, Bohman, & Gillam, 2014) revealed that macrostructure story elements are produced similarly during story retell tasks in Spanish and English. Squires and colleagues (2014) compared the narrative development of Spanish-English bilingual kindergarteners with language impairment to a group of age-matched typically developing peers over 2 years. The authors used Mayer's wordless picture books "One Frog Too Many" (Mayer, 1975) and "Frog on His Own" (Mayer, 1973) to elicit narratives from the children following a clinician model. They discovered that although both groups of children made improvements over the 2 years, the children with language impairment did not perform at the level of the typically developing group in terms of story grammar elements.
An interesting finding was that the scores the children with language impairment received on Spanish macrostructure elements in their kindergarten year were predictive of the scores they received on English macrostructure elements in first grade. This was not the case for microstructure elements such as syntactic structure and vocabulary. These data provide further evidence that the macrostructure elements of narratives are conceptually related in both languages and less influenced by language exposure/experience, making them an effective assessment tool for EL children suspected of language impairment. When utilizing wordless picture books, it is also important to obtain narrative productions in all languages spoken by the child being assessed for suspected language impairment (Kraemer et al., 2013; Squires et al., 2014). When the SLP does not speak the child's home language, the use of an interpreter is recommended. In terms of analysis, recorded narratives in English and Spanish can be evaluated for mean length of utterance (MLU) using the English and Spanish versions of SALT (Miller & Iglesias, 2012). Additional analysis of a child's inclusion of macrostructure details can be performed using the Monitoring Indicators of Scholarly Language (MISL; Gillam & Gillam, 2013).

Parent questionnaires

In addition to gathering data via formal and informal assessments, gathering data from a child's parent(s) or caregivers is necessary and valuable. The use of a parent questionnaire can give SLPs valuable insight into a child's language development and language experience within the home environment. In addition, the information gathered via parent questionnaires has been shown to have moderate predictive accuracy, with higher specificity (96%) than sensitivity (66%) (TD mean: 0.81 vs. LI mean: 0.50, p < 0.001).
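For readers less familiar with these diagnostic accuracy terms, the short sketch below shows how sensitivity and specificity are each computed from a screening outcome. The counts used are hypothetical illustrations, not data from the studies cited here.

```python
# Sketch of how sensitivity and specificity are computed.
# All counts below are hypothetical, not data from the cited studies.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Proportion of children with LI correctly identified by the measure."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Proportion of typically developing children correctly cleared."""
    return true_neg / (true_neg + false_pos)

# Hypothetical screening of 50 LI and 50 TD children:
# 33 of 50 LI children are flagged; 48 of 50 TD children are cleared.
print(f"sensitivity = {sensitivity(33, 17):.2f}")  # 0.66
print(f"specificity = {specificity(48, 2):.2f}")   # 0.96
```

A measure with this profile rarely mislabels typically developing children (high specificity) but misses a third of the children who actually have LI (moderate sensitivity), which is why questionnaire data supplement, rather than replace, other measures.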
Results have shown that when using parent questionnaires, information about the development of early milestones is the most powerful discriminator (d = 1.56) (Paradis, Emmerzael, & Sorenson Duncan, 2010). Parent questionnaires have also been shown to correlate with standardized test outcomes when used with EL children of varying home languages (Bedore, Peña, Joyner, & Macken, 2011). In cases where the parent has limited English proficiency and the SLP does not speak the parent's home language, SLPs must employ the services of an interpreter to assist with the interview process. Best practices in the use of interpreters recommend that they present exactly what the SLP states, rather than a paraphrased version. Among the many parent interview forms available to SLPs, the following two are supported by research. The first form was created by the Department of Linguistics at the University of Alberta Child English as Second Language (CHESL) Resource Center. The form and its instructions are available for download at http://www.chesl.ualberta.ca (Paradis, Emmerzael, & Sorenson Duncan, 2010). This questionnaire was designed to be nonspecific to the first language spoken by the child and can be translated into any language with the help of a well-trained interpreter. It includes important questions regarding developmental milestones, family history of language impairment, and parent education levels, which are all indicators found to be related to the presence of language impairment (Stanton-Chapman, Chapman, Bainbridge, & Scott, 2002; Trauner, Wulfeck, Tallal, & Hesselink, 2000). The second form was developed by Gutiérrez-Clellen and Kreiter (2003) and is available in English and Spanish on the San Diego State University website: http://slhs.sdsu.edu/.
Use of interpreters

The use of interpreters during the assessment process is crucial to understanding the characteristics of the child's native language, and it is also important when interacting with the client's family members. Guidelines in IDEIA (2004) clearly state that parents of children with disabilities who themselves have limited English proficiency must be provided with information in their home language in order to be able to actively participate in their child's meeting. As such, interpreters can help parents understand the assessment and intervention process as well as make decisions about their child's goals and progress (Langdon, 2009). According to the ASHA Practice Portal (http://www.asha.org/Practice-Portal/Professional-Issues/Collaborating-With-Interpreters/, 2015), it is the SLP's responsibility to find an interpreter who has native/near-native proficiency in the target language, understands the cultural norms and characteristics of the child and his/her family, and can explain the assessment and intervention process in professional language. Although finding an interpreter who fits all of these expectations may be difficult, it is extremely important to the effectiveness of the assessment process and the satisfaction of the child's family (Garcia, Roy, Okada, Perkins, & Wiebe, 2004). Although there is a paucity of data on the use of interpreters in the field of speech-language pathology, there is ample data on their use in special education and health-related fields. For example, Garcia and colleagues (2004) investigated parents' satisfaction with an interpretation experience after visiting a pediatric emergency department. Participants (N = 180) with limited English proficiency were randomly assigned to three possible interpretation modes.
The first mode was the most experienced interpreter, who demonstrated native-level Spanish proficiency through testing and received frequent in-hospital training in the interpretation of medical terminology. The second mode, "ad hoc translators," was any person in the hospital, employee or family member, who spoke Spanish but whose proficiency level was subjective and not predetermined by language fluency testing. The third mode was a telephone interpreter who translated the encounter via speakerphone. Not surprisingly, results from this study showed parents to be significantly more satisfied with not only the interpretation but the entire medical experience when the more proficient, highly trained interpreters were employed. This research can easily be applied to the field of speech-language pathology, as it demonstrates the importance of quality interpreters to parent and family satisfaction when interacting with professionals. Use of interpreters gives the families of children who need special education or speech and language services the power of inclusion and decision making during the Individualized Education Plan (IEP) process (Langdon, 2009; More, Hart, & Cheatham, 2013). Regarding the use of an interpreter during the language assessment, Langdon (2002) introduced a three-step process helpful to SLPs working with interpreters, known as "BID: briefing, interaction, and debriefing." During these three steps, the SLP and the interpreter thoroughly review the information provided about the child, plan for the interaction, work together to extract all pertinent information, and have a follow-up discussion about the observations and results gathered from the interaction. Additionally, Langdon (2002) recommends that data be collected qualitatively by the interpreter during a conversation with the child, as opposed to using standardized tests directly translated from English.
Following these guidelines when using interpreters will help provide more effective and efficient assessments for EL children (Langdon, 2002).

Factors influencing SLP assessment practices

The identification of language impairment in EL children poses greater challenges for monolingual English-speaking SLPs than for SLPs who speak a child's home language. Considering that 95% of SLPs are monolingual English speakers (ASHA, 2014; www.asha.org/uploadedFiles/Demographic-Profile-Bilingual-Spanish-Service-Members.pdf), many rely on English-only standardized measures when assessing EL children (Caesar & Kohler, 2007; Kraemer & Fabiano-Smith, 2015). Aside from navigating the verbiage of IDEIA, USOE, and ASHA, SLPs' assessment practices may be influenced by the following factors: 1) the lack of norm-referenced standardized bilingual language assessments, 2) the lack of training in nonbiased assessment practices in university training programs, 3) the lack of expertise in adequately utilizing interpreters during parent interviews and testing, and 4) operating under the assumption that standardized assessment tools must be administered in order to comply with district and state policy (Caesar & Kohler, 2007; Kraemer et al., 2013; Paradis, Schneider, & Duncan, 2013). These data suggest that several factors may influence how SLPs assess for language impairment in EL children. What has not been studied is whether these factors are present in the assessment practices of school-based SLPs working in northern Utah. To make this determination, this study will 1) establish whether school-based SLPs working in northern Utah adhered to IDEIA, USOE, and ASHA policy and guidelines during their assessments of EL children and 2) gather information via survey about SLPs' assessment practices, training, and their confidence in assessing children suspected of language impairment. The two aims are broken into the following five research questions: 1.
Did school-based SLPs in northern Utah adhere to federal, state, and professional guidelines during initial assessments of EL children? 2. Which tests and measures were used by SLPs to determine language impairment in EL children? 3. What are the primary factors that may be associated with SLPs' assessment practices (e.g., years of experience, training in the assessment of EL children, availability of and experience with using interpreters)? 4. What are the primary factors associated with SLPs' workplace (e.g., number of students on caseload, district mandates) that may influence their assessment practices? 5. What are SLPs' overall confidence levels in conducting language assessments for EL and monolingual English-speaking children? Although much of the aforementioned research has focused on the assessment of Latino EL children, the laws and policies guiding SLPs' assessment practices apply to all EL children regardless of first language. As such, this study will not limit itself to investigating the assessment of Latino EL children but will encompass all ethnic groups and home languages.

Predictions

In addressing research questions 1 and 2, it is predicted that, as a group, monolingual English-speaking SLPs will present with low levels of compliance with federal, state, and professional guidelines when assessing EL children. Particular low-compliance practices will include inconsistency in 1) assessing EL children in their home language, 2) including interpreters during the assessment process, 3) conducting in-depth parent interviews with special attention to home and English language development, and 4) incorporating informal measures during the assessment process. In addition, it is predicted that monolingual English-speaking SLPs will rely primarily on formal assessment results (standard scores) as the basis for determining the presence of a language impairment.
It is predicted that bilingual SLPs will present with higher levels of compliance with the abovementioned guidelines and practices as well as lower reliance on formal assessment results as the sole diagnostic indicator of language impairment. Addressing research questions 3 through 5, it is predicted that when responding to our survey, both monolingual English-speaking and bilingual SLPs will report feeling pressured to administer standardized tests and that caseload size will be related to the type of assessment used. With regard to SLP confidence levels, it is predicted that monolingual English-speaking SLPs will report higher confidence in assessing monolingual English-speaking children than EL children, and that bilingual SLPs will report higher confidence in assessing EL children. SLPs who report more professional training in EL assessment practices will report higher levels of confidence.

METHODS

File review

This study took place at several schools in northern Utah. To answer study questions 1 (Did school-based SLPs in northern Utah adhere to federal, state, and professional guidelines during initial assessments of EL children?) and 2 (Which tests and measures were used by SLPs to determine language impairment in EL children?), I conducted an in-depth review of all available initial speech-language assessment reports along with available supporting documentation. To access the reports, I acquired permission to conduct research from district administration and school boards. Once permission was granted, I obtained a list of EL children identified as receiving speech-language services from the director of special education, lead SLP, and/or school principal. Files were randomly selected by choosing every fifth file.
Information obtained from the file review included each child's grade level, names of tests used, test scores, informal measures, case history, parent report information, teacher comments, and documentation of interpreter participation. Files of children receiving services for speech sound disorder, fluency, and/or voice impairments were not reviewed. To systematically address questions 1 and 2 (as well as to provide a framework for data analysis), items adapted from Figueroa and Newsome (2006) were used to assess each file. These questions can be seen in Appendix A. Analysis of file review data was descriptive, and results are presented in frequency distribution tables.

SLP survey

To answer research questions 3 (What are the primary factors associated with SLPs' assessment practices, e.g., years of experience, training in the assessment of EL children, use of interpreters?), 4 (What are the primary factors associated with SLPs' workplace, e.g., number of students on caseload, district mandates, that may influence their assessment practices?), and 5 (What are SLPs' overall confidence levels in conducting language assessments for EL and monolingual English-speaking children?), a 25-item survey was administered to SLPs from the districts whose files were reviewed. The survey was based on the school-based intervention decision-making model (SIDM) of Brandel and Loeb (2011). In creating this model, Brandel and Loeb surveyed SLPs (N = 1,897) on factors influencing their treatment decision making. The SIDM consists of three domains: (1) the Student domain, which includes the specific strengths and needs of the children receiving speech and language services; (2) the Workplace domain, which includes caseload size and administrative support; and (3) the SLP domain, which involves clinical training, types of experiences, and years of experience. Only the SLP and Workplace domains were used for this study, as the Student domain pertains to treatment decisions, which were not surveyed.
In the SLP domain, the 25-item survey asked SLPs about their years of experience, training/coursework exposure to assessment of EL children, confidence levels surrounding assessment in general, specific procedures included in the assessment process, languages spoken, and employment of interpreters. In the Workplace domain, the survey asked SLPs about their caseload size, demographics, state/district mandates, and pressure to use standardized tools. The survey was administered to consenting SLPs during their team meetings or on a one-on-one basis. The semantic continuum of options for each item was adapted from the National Institutes of Health Patient Reported Outcomes Measurement Information System (PROMIS) sample survey questions for self-reported assessment of health status (http://www.nihpromis.org). The survey, including SLP consent, is presented in Appendix B. Survey items 1, 3, 4, 8, 9, and 14-23 addressed factors in the SLP domain; items 2, 5, 6, 7, 10-13, and 24 addressed factors in the Workplace domain. Analysis of survey results was performed using descriptive statistics, and results are presented in distribution tables.

RESULTS

The present study had two specific aims: (1) to gather speech-language assessment data (i.e., formal tests, informal measures) for EL children who had been assessed for language impairment and (2) to gather information via survey about SLPs' assessment practices, training, and their confidence in assessing children suspected of language impairment. The five research questions addressing the specific aims are as follows: 1) Did school-based SLPs in northern Utah adhere to federal, state, and professional guidelines during initial assessments of EL children? 2) Which tests and measures were used by SLPs to determine language impairment in EL children? These questions were answered by conducting an in-depth systematic review of 63 EL student files.
Files consisted of speech-language assessment reports and any available supporting documentation (i.e., report cards, medical reports, teacher notes, etc.). It was expected that many languages would be represented in the file review, but after reviewing the 63 files, Spanish was the only home language other than English documented. As such, the use of EL in the remainder of this document refers to Spanish-English speaking children. Grade levels of children represented in the file review ranged from preschool to ninth grade, with one 11th-grade student. The majority of files reviewed were of children in grades 2 through 5. Research questions 3 through 5 address the second specific aim of the study and are as follows: 3) What are the primary factors that may be associated with SLPs' assessment practices (e.g., years of experience, training in the assessment of EL children, availability of and experience with using interpreters)? 4) What are the primary factors associated with SLPs' workplace (e.g., number of students on caseload, district mandates) that may influence their assessment practices? 5) What are SLPs' overall confidence levels in conducting language assessments for EL and monolingual English-speaking children? These questions were addressed through completion of a 25-item survey. The survey, administered to 35 SLPs, addressed factors associated with specific SLP assessment practices and workplace aspects associated with assessment decisions. The following sections present the findings for each research question.

Question 1

Did school-based SLPs in northern Utah adhere to federal, state, and professional guidelines during initial assessments of EL children? It was predicted that, as a group, monolingual SLPs would present with low levels of compliance in adhering to federal, state, and professional guidelines when assessing EL children.
It was also predicted that low-compliance practices would include inconsistent (1) assessment of EL children in their home language, (2) use of interpreters during the assessment process, (3) collection of in-depth parent interviews, and (4) incorporation of informal measures during the assessment process. "Low compliance" was considered to be when files revealed that, as a group, SLPs in a particular district presented with fewer instances of inclusion of the aforementioned assessment procedures necessary for adherence to mandates. The file review did not support the prediction that monolingual SLPs would inconsistently assess children in their native language or inconsistently use interpreters. The majority of EL students, 50 of 63 (79%), were assessed in their native language (Spanish), and 51 of 63 assessments (81%) included the use of an interpreter for some aspect of the assessment process (e.g., translation of a formal test, language sample, or parent interview). Consistent with the prediction that monolingual SLPs would present with low levels of compliance in the collection of in-depth parent interviews and the incorporation of informal measures to supplement information gathered by formal measures, inconsistent use of these strategies was evident in the files. With regard to parent interviews, only 9 of the 63 files reviewed (14%) included data from parents about the development of their child's language(s). Additionally, 51 of 63 files (81%) documented the use of formal tests (Spanish and/or English) as the only form of assessment for eligibility determination. In each of the participating districts, bilingual Spanish-English-speaking SLPs conducted all of the assessments for the Spanish-speaking EL children.
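The compliance figures reported here are simple descriptive percentages of the 63 reviewed files. As a minimal sketch of that arithmetic (the dictionary labels are my own shorthand for the practices described above, not the study's item wording):

```python
# Descriptive percentages over the 63 reviewed files.
# Counts come from the file review results reported above;
# the labels are shorthand, not the study's own item wording.
TOTAL_FILES = 63

counts = {
    "assessed in native language": 50,
    "interpreter used": 51,
    "parent interview data included": 9,
    "formal tests as only measure": 51,
}

for practice, n in counts.items():
    pct = round(100 * n / TOTAL_FILES)
    print(f"{practice}: {n}/{TOTAL_FILES} ({pct}%)")
```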
Interestingly, the prediction that bilingual Spanish-English-speaking SLPs would adhere to federal, state, and professional guidelines when assessing EL children was also not supported. Spanish-English speaking bilingual SLPs employed formal assessments almost exclusively when assessing Spanish-English EL students. Table 1 presents the answers to the 15 assessment questions used to guide the review of files. For ease of analysis, data from the three districts were combined.

Table 1. Results from review of speech-language assessment reports.
________________________________________________________________________
Item                                                    Yes   No   N/A   N.S
________________________________________________________________________
1.  Was standardized testing the only form of
    assessment?                                          51   12
2.  Was the student tested in his or her native
    language?                                            50   13
3.  If yes, did the test given have good psychometric
    properties and represent the student within the
    norming sample?                                       0   50
4.  If the test was not psychometrically sound, was
    there a disclaimer regarding its validity?            1   62
5.  Is there any discussion of the student's language
    dominance and English proficiency?                   22   41
6.  Is there discussion of time spent in the United
    States/time of exposure to the English language?     19   44
7.  Is there mention of parental or caregiver
    information/primary language spoken at home?         32   31
8.  Was an interpreter used during the assessment?       51   10    2
9.  If yes, was the interpreter familiar with the
    cultural and linguistic variations of the
    language?                                                            53
10. Did the assessment involve any analysis of the
    child's schoolwork?                                  10   27   19
11. Did the assessment include a "disclaimer"
    regarding the use of monolingual assessment
    tools?                                                3   41   19
12. Were informal assessment measures used (e.g.,
    language sampling, DA, NWR, story retell)?           13   50
13. Did the assessment include information regarding
    the child's development of their native language
    compared to other children?                           9   54
14. Did the assessment include information regarding
    presence of family history of speech or language
    impairment?                                          10   53
15. Did the assessment include information about the
    child's acquisition of typical developmental
    milestones?                                          19   44
________________________________________________________________________
Legend. N/A = Not Applicable; N.S = Not Specified

Question 2

Which tests and measures were used by SLPs to determine language impairment in EL children? Table 2 presents the tests most frequently used by SLPs during their initial assessments of EL students. Although formal Spanish versions of English tests are available, they have flawed psychometric properties (e.g., a monolingual Spanish-speaking norming sample or a norming sample that includes children with impairments) (Gutiérrez-Clellen & Simon-Cereijido, 2009; Langdon & Cheng, 2002). As such, the use of these tests with Spanish-English EL children may lead to invalid results. A further concern with using standardized tests with EL students is the reliance on single-word picture vocabulary naming tests (e.g., EVT-2, PPVT-4, and EOWVT-Bilingual) to determine LI. Paradis et al. (2013) demonstrated that scores on English single-word picture vocabulary naming tests do not necessarily differentiate typically developing EL children from their language-impaired EL peers. As such, the presence of "typical" vocabulary skills may not provide enough evidence to rule out LI the way that other linguistic subdomains would (i.e., story grammar, morphology, and verbal memory). Answers to research question 2 relate to the abovementioned prediction regarding SLPs' minimal use of informal measures to supplement the formal data gathered in their assessments.
Thirteen of the 63 files (21%) contained data from informal measures. Of these 13, 11 (85%) used language sampling. These findings are consistent with research conducted by Caesar and Kohler (2007), who demonstrated that language sampling is the primary form of informal assessment used by SLPs when assessing EL students. The remaining two assessment reports with documented informal measures used a story retell task to glean more information about the child being assessed.

Table 2. Commonly used tests extracted from file review of speech-language assessment reports, ordered from most frequently used to least frequently used.
________________________________________________________________________
Tests used                                                     Frequency
________________________________________________________________________
Peabody Picture Vocabulary Test - 4th Ed.                             32
Expressive Vocabulary Test - 2nd Ed.                                  23
Expressive One-Word Picture Vocabulary Test: Bilingual                19
Clinical Evaluation of Language Fundamentals - 4th Ed.                18
Preschool Language Scales: Spanish - 4th Ed.                          18
Test de Vocabulario en Imagenes Peabody                               13
Clinical Evaluation of Language Fundamentals - Spanish - 4th Ed.      12
Receptive One-Word Picture Vocabulary Test: Bilingual                 11
Expressive Vocabulary Test                                             6
Comprehensive Assessment of Spoken Language                            5
Preschool Language Scales: Spanish - 5th Ed.                           5
Preschool Language Scales - 4th Ed.                                    4
Test of Adolescent and Adult Language - 4th Ed.                        3
The Listening Comprehension Test - Adolescent                          3
Clinical Evaluation of Language Fundamentals - 5th Ed.                 2
Clinical Evaluation of Language Fundamentals: Preschool - 2nd Ed.      2
Preschool Language Scales - 5th Ed.                                    2
Expressive One-Word Picture Vocabulary Test                            2
Test of Oral Language Development - Primary - 4th Ed.                  2
Clinical Evaluation of Language Fundamentals: Preschool -
  Spanish - 2nd Ed.                                                    1
Oral and Written Language Scales - 2nd Ed.                             1
Test of Oral Language Development - Intermediate - 4th Ed.             1
Peabody Picture Vocabulary Test - 3rd Ed.                              1
________________________________________________________________________

Question 3

What are the primary factors associated with SLPs' assessment practices (e.g., years of experience, training in the assessment of EL children, availability of and experience with using interpreters)? Tables 3 through 7 present the responses of SLPs working in three districts (D1, N = 7; D2, N = 9; D3, N = 19) to items regarding the use of informal assessment measures. It was hypothesized that SLPs who reported having graduate and/or other professional training in EL assessment practices would report higher confidence levels and would report using more of the informal measures highlighted in the literature review section of this paper when assessing EL children. SLPs were considered to have higher confidence levels when responding "quite a bit" or "very much" to survey questions. Table 3 presents the type of training SLPs received regarding assessing EL children. Table 4 presents SLPs' reported inclusion of nonword repetition tasks in their assessments of EL and monolingual English-speaking children. Table 5 presents SLPs' reported inclusion of information regarding achievement of typical developmental milestones during their assessments of EL and monolingual English-speaking children. Table 6 presents SLPs' reported inclusion of information about presence of family history of language impairment during their assessments of EL and monolingual English-speaking children. Table 7 presents SLPs' perceived level of qualification to conduct assessments of EL children compared to their first year working in the schools.

Table 3. Survey responses for participation in education about assessment of EL children.
________________________________________________________________________
District   Type of training            Yes   No
________________________________________________________________________
D1         Graduate coursework           4    3
           Postgraduate training         5    1
D2         Graduate coursework           6    3
           Postgraduate training         7    2
D3         Graduate coursework          16    3
           Postgraduate training        14    2
________________________________________________________________________

Table 4. Survey responses for inclusion of NWR tasks in the assessment process.
________________________________________________________________________
District   Language of student         Yes   No
________________________________________________________________________
D1         Monolingual English           2    5
           English learning              2    5
D2         Monolingual English           8    1
           English learning              7    1
D3         Monolingual English           2   17
           English learning              3   16
________________________________________________________________________
*NWR = Nonword repetition

Table 5. Survey responses for inclusion of information about typical developmental milestones.
________________________________________________________________________
District   Language of student   Never   Sometimes   Usually   Always
________________________________________________________________________
D1         Monolingual English       0           4         1        2
           English learning          0           3         2        2
D2         Monolingual English       0           2         3        4
           English learning          0           2         2        5
D3         Monolingual English       0           0         2       17
           English learning          0           0         2       17
________________________________________________________________________

Table 6. Survey responses for inclusion of information about presence of family history of language impairment.
________________________________________________________________________
District   Language of student   Never   Sometimes   Usually   Always
________________________________________________________________________
D1         Monolingual English       0           4         1        2
           English learning          0           4         1        2
D2         Monolingual English       0           4         3        2
           English learning          0           3         3        3
D3         Monolingual English       0           2         3       14
           English learning          1           1         4       13
________________________________________________________________________

Table 7. Survey responses for perceived level of qualification assessing EL children.
________________________________________________________________________
District   Not at all   Somewhat   Quite a bit   Very much
________________________________________________________________________
D1                  0          4             2           1
D2                  0          3             0           6
D3                  0          1             6          11
________________________________________________________________________
Note: Perceived level of qualification was compared to the SLP's first year working in the schools.

As seen in Table 3, D3 had the highest percentage of SLPs reporting graduate as well as additional professional education regarding assessment methods for EL children. This district also reported the highest numbers of "always" including information about developmental milestones and family history of language impairment in their assessments. It is important to note, however, that assessment practices such as inclusion of NWR tasks, developmental milestones, and family history appear to be uniform regardless of the child's first-language status. This may indicate that SLPs' models for assessment of EL children are associated more with the methods employed for the monolingual children on their caseloads than with graduate or professional education on the subject.
The SLPs were also surveyed about the extent to which they felt qualified to assess EL children at the present time compared with their first year following the clinical fellowship (CF). The SLPs in D3 reported the highest percentage of participation in graduate coursework and postgraduate education. They also reported the highest perceived level of qualification in conducting EL assessments compared to the other districts.

Question 4

What are the primary factors associated with SLPs' workplace (e.g., number of children on caseload and district mandates) that may influence their assessment practices?

Table 8 presents the distribution of responses regarding pressure to use standardized tests during assessments. It was hypothesized that all SLPs would report feeling pressured to use formal tests in their assessments of all children (monolingual English-speaking or EL). As predicted, the majority of the SLPs reported feeling either "quite a bit" or "very much" pressured to use formal tests, regardless of the language status of the child. What was not predicted, however, was that SLPs would report feeling greater pressure to use formal tests with monolingual English-speaking children than with EL children. Pressure to use standardized assessments may be a workplace factor influencing SLPs' assessment practices. Table 9 presents survey responses regarding whether or not districts mandate specific formal tools. The majority of SLPs reported not having mandates from their district regarding specific formal tools to use when assessing EL or monolingual English-speaking children. Some inconsistencies were seen among individual SLPs within each district (particularly in D2) regarding whether or not such mandates exist for EL children. Table 10 presents the average caseload size based on survey responses provided by SLPs.
Caseload size is another workplace domain that was predicted to be related to the types of assessments SLPs chose to employ. Results surrounding this prediction are similar to those previously discussed regarding SLP training on appropriate assessment methods for EL children. Once again, D3 had the lowest average caseload size and the highest number of reported "always" responses for including information about developmental milestone acquisition and family history of language impairment. However, the SLPs in all districts appear to be implementing the same strategies for the EL children on their caseloads as for the monolingual children.

Table 8. Pressures to use standardized tools.
________________________________________________________________________
District   Language of student    Not at all   Somewhat   Quite a bit   Very much
________________________________________________________________________
D1         Monolingual English        0           4            1            2
           English learning           0           5            1            1
D2         Monolingual English        2           1            3            3
           English learning           1           1            4            3
D3         Monolingual English        1           2            6            9
           English learning           1           2            8            6
________________________________________________________________________

Table 9. District mandates regarding use of standardized tools.
________________________________________________________________________
District   Language of student       Yes   No
________________________________________________________________________
D1         Monolingual English         1    6
           English learning            1    6
D2         Monolingual English         0    8
           English learning            5    4
D3         Monolingual English         4   15
           English learning            5   13
________________________________________________________________________

Question 5

What are SLPs' overall confidence levels in conducting language assessments for EL and monolingual English-speaking children?

Table 11 presents the SLPs' overall confidence levels in their ability to conduct language assessments of both monolingual and EL children.
It was predicted that monolingual English-speaking SLPs would report "quite a bit" or "very much" confidence more often when conducting assessments of monolingual English-speaking children than of EL children. In addition, it was predicted that bilingual SLPs would report "quite a bit" or "very much" confidence in assessing EL children. For ease of data interpretation, "quite a bit" and "very much" responses were considered higher levels of confidence, and "not at all" and "somewhat" responses were considered lower levels of confidence. Tables 12 and 13 present a comparison between monolingual English-speaking SLPs' and bilingual SLPs' confidence levels when assessing monolingual English-speaking children versus English learning children. Table 14 presents survey responses from SLPs regarding their perception of whether EL students in their districts are being over-identified as having language impairments, under-identified, or identified at the same rate as their monolingual peers.

Table 10. Average caseload size per district.
________________________________________________________________________
District   Average caseload size
________________________________________________________________________
D1                  57
D2                 100
D3                  47
________________________________________________________________________

Table 11. Overall confidence levels in conducting assessments.
________________________________________________________________________
Language of student    Not at all   Somewhat   Quite a bit   Very much
________________________________________________________________________
Monolingual English        0            2           7            25
English learning           0           10          13            11
________________________________________________________________________

As a group, the SLPs in this study reported higher confidence levels in their ability to appropriately assess monolingual English-speaking children than EL children. It was hypothesized that monolingual English-speaking SLPs would report higher confidence levels when assessing monolingual English-speaking children and that bilingual SLPs would report higher confidence levels when assessing EL children. Data presented in Table 12 suggest that monolingual English-speaking SLPs do, in fact, have higher confidence in conducting assessments for monolingual English-speaking children than for EL children. Contrary to the prediction, the bilingual SLPs (represented in Table 13) also more frequently reported "very much" confidence in their assessment of monolingual English-speaking children than EL children.

Table 12. Monolingual SLPs' confidence levels in conducting assessments.
________________________________________________________________________
Language of student    Not at all   Somewhat   Quite a bit   Very much
________________________________________________________________________
Monolingual English        0            1           6            21
English learning           0            8          11             9
________________________________________________________________________

Table 13. Bilingual SLPs' confidence levels in conducting assessments.
________________________________________________________________________
Language of student    Not at all   Somewhat   Quite a bit   Very much
________________________________________________________________________
Monolingual English        0            1           1             4
English learning           0            2           2             2
________________________________________________________________________

Table 14. SLPs' opinions regarding misidentification of EL children for language impairment.
________________________________________________________________________
District   Over-identified   Under-identified   Identified at same rate
________________________________________________________________________
D1                2                  2                      3
D2                4                  2                      3
D3                4                  2                     13
________________________________________________________________________
Data from Table 14 suggest that, overall, 45.7% of SLPs believed EL children are being either under- or over-identified, whereas 54.3% reported believing EL children are being identified at the same rate as their monolingual English-speaking peers. This information is relevant to the research question about confidence levels because an overall lack of confidence in the assessment process for EL children would be expected to lead to misidentification of the children being assessed. These numbers show the opposite, with more SLPs reporting that EL children are identified at the same rate as their monolingual peers.

DISCUSSION

The specific aims of this study were two-fold: (1) to gather speech-language assessment data (i.e., formal tests, informal measures, and any supporting documentation) for EL children who had been assessed for language impairment and (2) to gather information via survey about SLPs' assessment practices, training, and their confidence in assessing children, regardless of language status, suspected of language impairment. Results from the first aim clearly demonstrate low levels of compliance with federal, state, and professional guidelines, particularly in the use of in-depth parent interviews and informal assessment measures to supplement results from formal testing. In addition, the majority of SLPs used interpreters and Spanish standardized tests but included little to no documentation addressing the potential psychometric issues with using these tests (i.e., inappropriate norming samples). These findings support previous research (Kraemer & Cho, in review, 2016; Kraemer & Fabiano-Smith, in review, 2016; Langdon & Cheng, 2002) showing that SLPs working in northern California also relied heavily (if not exclusively) on standardized assessment tools to make diagnostic decisions about EL Latino children.
The finding that the majority of SLPs assessed EL children in the child's native language differs from previous findings (Kraemer & Cho, in review, 2016; Kraemer & Fabiano-Smith, in review, 2016). While positive in that consideration of a child's home language is vital, these findings reveal remaining concerns about basing a diagnosis on standardized tools with inadequate psychometrics. As stated previously, for SLPs to comply with federal, state, and professional guidelines, assessments must "yield accurate information on what the child knows" (IDEIA, 2014) and be "acquired through high quality valid and reliable assessments" (USOE, 2010). The sole use of standardized assessments without appropriate psychometric properties will not yield representative or accurate results for EL children. It is important for SLPs to incorporate other assessment practices, such as parent interviews, to supplement standardized test scores for a more comprehensive assessment. The second aim of the study revealed varying results regarding the association between two domains (SLP and Workplace) and SLPs' assessment choices and perceived level of qualification/confidence in conducting EL assessments. With respect to the SLP domain, results indicated that graduate or other clinical training on the subject of EL assessment may not be a factor associated with assessment practices and perceived level of qualification. Rather, SLPs who have received graduate or clinical training on the topic may have difficulty with clinical application due to a lack of hands-on experience conducting these assessments and synthesizing their results to make diagnostic decisions. In terms of the Workplace domain, caseload size and district mandates directing SLPs to use specific standardized tests may exert a greater influence on SLPs' assessment process, regardless of a child's language status. As such, factors within the Workplace domain may influence SLPs' assessment decisions more so than training.
This phenomenon, along with other interesting survey results, is discussed in the following sections.

Low levels of compliance with inclusion of informal assessment procedures and parent information

As stated, results from the file review demonstrated a heavy reliance on the use of standardized assessments for speech-language eligibility. This finding closely reflects the survey responses in that the majority of SLPs reported feeling pressured to use standardized tests (English-only as well as Spanish-English versions). Interestingly, there was variation in responses from SLPs about whether or not their district requires specific tests. It would be expected that all SLPs within a district would respond to this question similarly, but that was not the case. As seen in the results, five of nine SLPs in D2 reported that mandates exist in their district regarding the use of specific tests when assessing children for LI, while four of nine SLPs in the same district reported no such mandate. This may indicate that district policies regarding assessment practices are not well communicated, resulting in uncertainty in the assessment process. It is also possible that some SLPs know their districts mandate certain tests but conduct their assessments otherwise. An interesting finding came out of the file review when investigating the frequency of specific formal tests used during assessment of EL children. The test used most frequently (in 32 of 63 assessment reports), the Peabody Picture Vocabulary Test - Fourth Edition (PPVT-4), is also a test that has been found to be an insensitive diagnostic indicator of language impairment in EL children (Paradis et al., 2013). Research on the use of this test indicated that typically developing EL children tend to score similarly to their language-impaired peers.
It is possible that the SLPs using this test have not been exposed to this particular research, which demonstrates the importance of maintaining current, evidence-based practice methods to create the most efficient and effective assessment practices. It was also expected that SLPs would respond that they seldom used informal assessment measures such as NWR, as none were documented in the files. Even though 88% (N = 9) of SLPs in D2 reported using NWR tasks during their assessment of EL children, no NWR tasks were documented in the files. It is possible, though not likely, that SLPs reporting the use of NWR tasks failed to include the interpretation of these data in their reports. Additionally, four SLPs qualified their responses to the survey item regarding NWR tasks, citing a lack of understanding of how to interpret the results and an inability to access NWR tasks with supporting evidence. Interpretation of results is a crucial component in synthesizing assessment information for determination of eligibility for services. As such, the use of NWR will likely not lead to appropriate identification of LI if there is a lack of understanding of what the results indicate. The absence of in-depth parent interviews in grades other than preschool is disconcerting. As documented, information regarding speech, language, and motor development acquired via parent interview is vital for SLPs to consider when determining whether a child's skills in these areas reflect typical developmental difference or disorder (Paradis, Emmerzael, & Duncan, 2010). To ensure a comprehensive assessment, SLPs must document speech, language, and motor developmental milestones for all children regardless of grade level.
Lack of variation in perceived level of qualification despite participation in graduate coursework/clinical training

Of the three districts, D3's SLPs reported the most participation in graduate coursework (N = 16) and other clinical training (N = 14) on the topic of assessment practices for EL children. This is interesting, as the SLPs in the district reporting the lowest percentage of members with graduate coursework and other clinical training (D2) shared similar perceptions of their qualification. This result may indicate that although SLPs participate in coursework and trainings on the importance of appropriate EL assessment practices, they may lack the confidence and/or experience to employ them clinically.

Caseload size and district mandates

It was discovered that the district with the smallest average caseload size (D3) also reported the greatest use of informal assessment measures deemed to be diagnostically relevant (i.e., developmental milestones and family history of language delay). However, these SLPs reported using these measures equally for EL and monolingual English-speaking children. This may suggest that if caseload size indeed dictates assessment decisions, the assessment practice is similar, if not uniform, for all children being assessed regardless of their language status. It is not surprising that the SLPs with the largest average caseload size (D2) used the greatest number of standardized assessments. This relationship is expected because large caseloads (anything over the ASHA-suggested cap of 52) leave less time during the work day for other caseload management duties such as planning, providing therapy, assessing, and completing paperwork. SLPs with large caseloads may struggle to balance these various aspects of the job and, as a consequence, conduct quick assessments rather than time-consuming comprehensive assessments.
This notion may substantiate the finding that SLPs in D2 frequently administered five different standardized tests to a single child with no evidence of informal measures. Conversely, SLPs in the districts that reported typical caseload numbers did occasionally employ informal measures such as language sampling in the child's native language. Thus, a combination of factors, such as SLPs reporting "quite a bit" or "very much" pressure from district administrators to use standardized tests along with large caseload sizes, may be associated with the assessment practices revealed by the file review and survey data.

Overall confidence levels

The survey results yielded some interesting findings regarding SLPs' confidence levels when assessing both monolingual English-speaking children and their EL peers. Interestingly, the confidence levels of bilingual SLPs were not higher than those of their monolingual English-speaking colleagues with regard to the assessment of EL children. Both groups reported being more confident assessing monolingual English-speaking children. Unless SLPs completed their coursework in a program offering a bilingual certificate or in another country, their academic and clinical training was more than likely geared toward the assessment and treatment of monolingual English-speaking children. As such, U.S.-trained bilingual SLPs who speak a language other than English may not feel able to accurately assess EL children as a result of their training and, consequently, may not feel very confident doing so.

STUDY LIMITATIONS

It is important to note the various limitations of this study. First, this was a preliminary study and, as such, data were best analyzed using descriptive measures. In addition, the small sample size limits the generalizability of the findings. Second, there was a lack of congruence between the SLPs who completed surveys and the files reviewed.
That is, some of the SLPs who completed surveys may not have had any of their files reviewed. It is also likely that many of the children were assessed by SLPs in other districts or by SLPs who no longer work in the district. These data, if available, may have led to different findings and, as such, obtaining them would be a next step in continuing this line of study. Related to this limitation is the unequal representation of files among the three districts. D2 had a greater number of files available to review; as such, the assessment practices attributed to the reported workplace factors may have been influenced by the sheer volume of assessments and large caseload sizes present in that particular district. It would be helpful to understand whether balancing the number of files reviewed in each district affects the findings. Finally, it would be beneficial to study the assessment practices of the SLPs who conducted the initial assessments. This may lead to a deeper understanding of the factors responsible for current practice, which could then inform district-wide training for all educators and administrators.

CONCLUSION

Although this preliminary study presents the assessment practices of SLPs working in three school districts, it is possible these practices are being employed elsewhere in Utah and in the U.S. If SLPs in Utah failed to adhere to federal, state, and ASHA guidelines in their assessment of EL children, it is probable that SLPs working in other states are engaged in similar methods that could potentially misdiagnose EL students. This exploratory study needs to be replicated in other districts within and outside of Utah to better understand the depth of the issue and its cause.
Based on the findings of this study, there appears to be a disconnect between the assessment practices reported by SLPs and the practices actually employed when assessing EL children suspected of language impairments. The data gathered demonstrate that SLPs (whether bilingual or monolingual English-speaking) administer, almost exclusively, standardized tests in spite of reporting knowledge of how to administer informal measures (e.g., language sampling). Another important component of a comprehensive assessment, parent interviews/questionnaires, was evident only in the speech-language reports of preschool-aged children. It is suspected that the workplace factor of large caseload sizes for SLPs working in grades K-12 does not allow time for interviews and thus, they do not occur. Future research in this area may benefit from interviews with the SLPs conducting assessments to address the specific reasons behind the inclusion of specific procedures such as parent interviews. Efforts in researching current assessment practices must coincide with the development of training in evidence-based assessment practices. The responsibility is on both researchers and district administrators to work together to provide trainings at both the district and university levels. The trainings must also be clinically applicable and accessible to SLPs working in the schools, despite the workplace demands that appear to be affecting their practices. In addition, the outcomes of these trainings must be strongly supported and maintained. Future research efforts must also address the development of evidence-based, nonbiased, ecologically valid language assessment tools best suited for EL children.

APPENDIX A

QUESTIONS GUIDING FILE REVIEW

1. Was standardized testing the only form of assessment?
2. Was the student tested in his or her native language?
3. If yes, did the test given have good psychometric properties and represent the student within the norming sample?
4. If the test was not psychometrically sound, was there a disclaimer regarding its validity?
5. Is there any discussion of the student's language dominance and English proficiency?
6. Is there discussion of time spent in the United States/time of exposure to the English language?
7. Is there mention of parental or caregiver information/the primary language spoken at home?
8. Was an interpreter used during the assessment?
9. If yes, was the interpreter familiar with the cultural and linguistic variations of the language?
10. Did the assessment involve any analysis of the child's schoolwork?
11. Did the assessment include a "disclaimer" regarding the use of monolingual assessment tools?
12. Were informal assessment measures used (e.g., language sampling, DA, NWR, story retell)?
13. Did the assessment include information regarding the child's development of his or her native language compared to other children?
14. Did the assessment include information regarding the presence of a family history of speech or language impairment?
15. Did the assessment include information about the child's acquisition of typical developmental milestones?

APPENDIX B

SLP SURVEY QUESTIONS

Thank you for taking the time to complete my survey. The purpose of this survey is to determine the current assessment practices of speech-language pathologists (SLPs) who work with English learning (EL) school-aged children. This survey will examine the following: the demographic representation of SLPs' caseloads, SLPs' practices when assessing the language skills of EL students suspected of having a language impairment, and previous training in the assessment of EL children. This survey is being conducted as part of my master's thesis requirement in the Department of Communication Sciences and Disorders at the University of Utah.
All data collected as a result of this survey will remain the property of the University of Utah. Your participation in this survey is voluntary and you have the option of withdrawing at any time during the survey. Completion of this survey will take approximately 10-15 minutes. To ensure confidentiality, this survey does not contain any personally identifiable information. By agreeing to these terms and conditions, you acknowledge that the data you provide will be used for scholarly research purposes only and you relinquish the right to withdraw any information that you may provide upon completion of this survey. If you have any questions about the research study, please contact Robert Kraemer, Ph.D., at (801) 587-9200. This survey has been reviewed according to the University of Utah's Institutional Review Board policies and procedures for research involving human subjects. Thank you for your time and consideration.

Please check a response:
☐ Yes, I agree to participate
☐ No, I choose not to participate

1. What year did you graduate with your master's degree in speech-language pathology? ________

2. How many years of experience do you have working as a speech-language pathologist in the schools? ________

3. Number of students on your caseload:
   a. How many students do you currently have on your caseload? ______
   b. Of those students, how many are English learning? _____
   c. Of the English learning students on your caseload, how many are Spanish-English speaking? ______

4. Besides English, do you speak any other languages fluently? ☐ Yes ☐ No
   If yes, what other languages do you speak? _________________

5. Do you conduct speech-language assessments in languages other than English? ☐ Yes ☐ No
   If yes, which language(s)? _____________________________________________________________
   If no, in general, which method(s) do you use? Check all that apply:
   ☐ A bilingual speech-language pathologist conducts all of my non-English assessments.
   ☐ Standardized tests administered in both the student's primary language and English.
   ☐ Standardized English tests translated into the student's primary language.
   ☐ Standardized tests in the student's primary language (i.e., Spanish).
   ☐ Nonstandardized measures in the student's primary language (for example, dynamic assessment, language sampling, oral narrative assessment).
   ☐ Nonstandardized measures in English (for example, dynamic assessment, language sampling, oral narrative assessment).

6. Do you use response to intervention (RTI) strategies with the monolingual children on your caseload? ☐ Yes ☐ No
   If no, why? _____________________________________________________________

7. Do you use response to intervention (RTI) strategies with the English learning children on your caseload? ☐ Yes ☐ No
   If no, why? _____________________________________________________________

8. Do you use nonword repetition tasks during your assessment of monolingual English-speaking children? ☐ Yes ☐ No
   If no, why? _____________________________________________________________

9. Do you use nonword repetition tasks during your assessment of EL children? ☐ Yes ☐ No
   If no, why? _____________________________________________________________

10. Do you use an interpreter during the assessment process? ☐ Yes ☐ No
    If no, why?
    _____________________________________________________________
    If yes, which parts of the assessment are they involved with? Check all that apply:
    ☐ Formal (standardized) testing
    ☐ Parent interview
    ☐ Development of assessment tools
    ☐ Informal (nonstandardized) testing (i.e., questionnaires, language samples in the primary language, narrative assessment, etc.)

11. Does your district require the use of specific assessments when evaluating monolingual children? ☐ Yes ☐ No
    If yes, which ones? _____________________________________________________________

12. Does your district require the use of specific assessment tools when assessing EL students? ☐ Yes ☐ No
    If yes, which ones? _____________________________________________________________

13. During times when I am assessing monolingual English-speaking children, I feel pressured to use standardized tools.
    ☐ Not at all ☐ Somewhat ☐ Quite a bit ☐ Very much

14. During times when I am assessing English learning children, I feel pressured to use standardized tools.
    ☐ Not at all ☐ Somewhat ☐ Quite a bit ☐ Very much

15. During your graduate program, did you have coursework dedicated to the assessment of English learning students? ☐ Yes ☐ No
    If yes, was it:
    ☐ an entire course
    ☐ one or two lectures
    ☐ a section of a course

16. Have you been involved in continuing education or any in-service trainings dedicated to the assessment of English learning students? ☐ Yes ☐ No
    If yes, please indicate which one(s): _____________________________________________________________

17. I feel more qualified assessing English learning students for language impairment today than in my first year working in the schools.
    ☐ Not at all ☐ Somewhat ☐ Quite a bit ☐ Very much

18. I feel confident using my current assessment practices with the monolingual English-speaking children on my caseload.
    ☐ Not at all ☐ Somewhat ☐ Quite a bit ☐ Very much

19. I feel confident using my current assessment practices with the English learning children on my caseload.
    ☐ Not at all ☐ Somewhat ☐ Quite a bit ☐ Very much

20. When collecting case history information for the monolingual English-speaking children on my caseload, I ask parents about typical developmental milestones.
    ☐ Never ☐ Sometimes ☐ Usually ☐ Always

21. When collecting case history information for the English learning children on my caseload, I ask parents about typical developmental milestones.
    ☐ Never ☐ Sometimes ☐ Usually ☐ Always

22. When collecting case history information for the monolingual English-speaking children on my caseload, I ask about the presence of a family history of language impairment.
    ☐ Never ☐ Sometimes ☐ Usually ☐ Always

23. When collecting case history information for the English learning children on my caseload, I ask about the presence of a family history of language impairment.
    ☐ Never ☐ Sometimes ☐ Usually ☐ Always

24. I find information provided by parents helpful when determining eligibility for services.
    ☐ Never ☐ Sometimes ☐ Usually ☐ Always

25. With regard to identification for speech-language services, do you think the English learning students in your school district are being over-identified, under-identified, or identified at the same rate as their monolingual peers?
    ☐ Over-identified ☐ Under-identified ☐ Identified at the same rate

REFERENCES

American Speech-Language-Hearing Association. (2015). Bilingual service delivery. Retrieved from: http://www.asha.org/Practice-Portal/Professional-Issues/Bilingual-Service-Delivery/

American Speech-Language-Hearing Association. (2015). Collaborating with interpreters.
Retrieved from: http://www.asha.org/Practice-Portal/Professional- Issues/Collaborating-With-Interpreters/ American Speech-Language Hearing Association. (2014). Demographic profile of ASHA members. Retrieved from: www.asha.org/uploadedFiles/Demographic-Profile- Bilingual-Spanish-Service-Members.pdf Artiles, A.J., Rueda, R., Salazar, J.J., & Higareda, I. (2005). Within-group diversity in# minority disproportionate representation: English language learners in urban# school districts. Council for Exceptional Children, 71(3), 283-300. # Aud, S., Fox, M.A., & KewalRamani, A. (2010). Status and trends in the education of racial and ethnic groups. National Center for Education Statistics. Retrieved from: http://nces.ed.gov/pubs2010/2010015.pdf Bedore, L.M., Peña, E.D., Joyner, D., & Macken, C. (2011). Parent and teacher rating of # bilingual language proficiency and language development concerns. # International Journal of Bilingual Education and Bilingualism, 14, 489-511. # Brandel, J., & Loeb, D.F. (2011). Program intensity and service delivery models in the schools: SLP survey results. Language, Speech, and Hearing Services in Schools, 42, 461-490. Caesar, L.G., & Kohler, P.D. (2007). The state of school-based bilingual assessment:# Actual practice versus recommended guidelines. Language, Speech, and Hearing# Services in Schools, 38, 190-200. Campbell, L., & Taylor, O. (1992). ASHA-certified speech-language pathologists: # Perceived competency levels with selected skills. The Howard Journal of # Communication, 3(3/4), 163-176. # # # # # 63# Dowden, P., Alarcon, N., Vollan, T., Cumley, G.D., Kuehn, C.M., & Amtmann, D. (2006). Survey of SLP caseloads in Washington state schools: Implications and strategies for action. Language, Speech, and Hearing Services in Schools, 37, 104-117. FAIR. (2012). English language learners and public education in Utah. Retrieved from: http://www.fairus.org/DocServer/Utah_LEP_final.pdf Figueroa, R.A., & Newsome, P. (2006). 
The diagnosis of LD in English learners: Is in nondiscriminatory? Journal of Learning Disabilities, 39(3), 206-214. # Friberg, J.C. (2010). Considerations for test selection: How do validity and reliability# impact diagnostic decisions? Child Language Teaching and Therapy, 26(1), # 77-92. # Garcia, E.A., Roy, L.C., Okada, P.J., Perkins, S.D., & Wiebe, R.A. (2004). A comparison# of the influence of hospital-trained, ad-hoc, and telephone interpreters on # perceived satisfaction of limited English-proficient parents presenting to a # pediatric emergency department. Pediatric Emergency Care, 20(6), 373-378. # Gillam, S., & Gillam, R. (2013). Monitoring Indicators of Scholarly Language (MISL). Logan, UT: Utah State University. Gillam, R.B., Peña, E.D., Bedore, L.M., Bohman, T.M., & Mendez-Perez, A. (2013).# Identification of specific language impairment in bilingual children: I. assessment# in English. Journal of Speech, Language, and Hearing Research, 56, 1813-1823. # Girbau, D., & Schultz, R.G. (2008). Phonological working memory in Spanish-English bilingual children with and without specific language impairment. Journal of Communication Disorders, 41, 124-145. Gutiérrez-Clellen, V.F., & Kreiter, J. (2003). Understanding child bilingual acquisition using parent and teacher reports. Applied Psycholinguistics, 24, 267-288. Gutiérrez-Clellen, V.F., & Peña, E. (2001). Dynamic assessment of diverse children: A# tutorial. Language, Speech, and Hearing Services in Schools, 32, 212-224. # Gutiérrez-Clellen, & Simon-Cereijido. (2009). Using language sampling in clinical assessments with bilingual children: Challenges and future directions. Seminars in Speech and Language, 30(4), 234-245. Hammer, C.S., Detwiler, J.S., Detwiler, J., Blood, G.W., & Qualls, C.D. (2003). Speech-# language pathologists' training and confidence in serving Spanish-English # bilingual children. Journal of Communication Disorders, 37, 91-108. # # # # # # 64# Kan, P.F., & Windsor, J. (2010). 
Word learning in children with primary language# impairment: A meta-analysis. Journal of Speech, Language, and Hearing# Research, 53, 739-756. # Kapantzoglou, M., Restrepo, M.A., & Thompson, M.S. (2012). Dynamic assessment of# word learning skills: Identifying language impairment in bilingual children.# Language, Speech, and Hearing Services in Schools, 43, 81-89. # Kraemer, R., Coltisor, A., Kalra, M., Martinez, M., Savage, B., Summers, S., &# Varadharajan, S. (2013). The Speech-Language Assessment of English Language# Learning Students: A Non-Standardized Approach. Retrieved from: http://sig16perspectives.pubs.asha.org/article.aspx?articleid=1810079 Kraemer, R., & Fabiano-Smith, L. (2015) Language assessment trends for English Language Learning (ELL) children. In review. Communication Disorders Quarterly. Langdon, H.W. (2009). Providing optimal special education services to Hispanic # children and their families. Communication Disorders Quarterly, 30(2), 83-96. # Langdon, H.W., & Cheng, L.R.L. (2002). Collaborating with interpreters and translators: A guide for communication disorders professionals. Eau Claire, WI: Thinking Publications. Lidz, C.S. (1987). Dynamic assessment: An interactional approach to evaluating learning potential. New York: Guilford Press. Lidz, C.S. (1991). Practitioner's guide to dynamic assessment. New York: Guilford Press. Mayer, M. (1973). Frog where are you? New York: Dial Press. Mayer, M. (1975). One frog too many. New York: Dial Press. Miller, J., & Iglesias, A. (2012). Systematic Analysis of Language Transcripts (SALT), Research Version 2012 [computer software]. Middleton, WI: SALT Software, LLC. More, C.M., Hart, J.E., & Cheathum, G.A. (2013). Language interpretation for diverse# families: Considerations for Special Education Teachers. Intervention in School# and Clinic, 49(2), 113-120. Morgan, P.L., Farkas, G., Hillemeier, M.M., Mattison, R., Maczuga, S., Li, H., & Cook, M. (2015). 
Minorities are disproportionately underrepresented in special education: Longitudinal evidence across five disability conditions. Educational researcher, 1-15. # # # # # 65# Mullen, R., & Schooling, T. (2010). The national outcomes measurement system for pediatric speech-language pathology. Language, Speech, and Hearing Services in Schools, 41, 44-60. National Institute of Health. (n.d.). PROMIS: Dynamic Tools to Measure Health Outcomes from the Patient Perspective. Retrieved from: (http://www.nihpromis.org/measures/SampleQuestions). Paradis, J., Emmerzael, K., & Duncan, T.S. (2010). Assessment of English language# learners: Using parent report on first language development. Journal of# Communication Disorders, 43, 474-497. # Paradis, J., Schneider, P., & Duncan, T.S. (2013). Discriminating children with language # impairment among english-language learners from diverse first-language # backgrounds. Journal of Speech, Language, and Hearing Research 56,# 971-978. # Paul, R., & Norbury, C.F. (2012). Language disorders from infancy through adolescence: Listening, speaking, reading, writing, and communicating (4th ed.). St. Louis, MO: Elsevier Mosby. Pearson, B.Z. (2002). Narrative competence among monolingual and bilingual school # children in Miami. In D.K. Oller & R.E. Eilers (Eds), Language and literacy in# bilingual children (pp. 135-174). Clevedon, UK: Multilingual Matters. Peña, E.D., Gillam, R.B., Bedore, L.M., & Bohman, T.M. (2011). Risk for poor # performance on a language screening measure for bilingual preschoolers and # kindergarteners. American Journal of Speech-Language Pathology, 20, 302-314.# Peña, E.D., Gillam, R.B., & Bedore, L.M. (2014). Dynamic assessment of narrative# ability in English accurately identifies language impairment in English language# learners. Journal of Speech, Language, and Hearing Research, 57, 2208-2220. Peña, E.D., Gutiérrez-Clellen, V.F., Iglesias, A., Goldstein, B.A., & Bedore, L.M. (2014). 
The Bilingual English-Spanish Assessment (BESA). Petaluma, CA: AR-Clinical Publications. Roulstone, S., Peters, T.J., Glogowska, M., & Enderby, P. (2008). Predictors and# outcomes of speech and language therapists' treatment decisions. International# Journal of Speech-Language Pathology, 10(3), 146-155. Ryan, C. (2013). Language use in the United States: 2011. American Community Survey Reports. Retrieved from: http://www.census.gov/prod/2013pubs/acs-22.pdf. # # # # # 66# Summers, C., Bohman, T.M., Gillam, R.B., Peña, E.D., & Bedore, L.M. (2010). # Bilingual performance on nonword repetition in Spanish and English. # International Journal of Language & Communication Disorders, 45(4),# 480-493. # Squires, K.E., Lugo-Neris, M.J., Peña, E.D., Bedore, L.M., Bohman, T.M., & Gillam,# R.B. (2013). Story retelling by bilingual children with language impairments and typically developing controls. International Journal of Language & Communication Disorders, 49(1), 60-74. # Stanton-Chapman, T.L., Chapman, D.A., Bainbridge, N.L., & Scott, K.G. (2002). # Identification of early risk factors for language impairment. Research in Developmental Disabilities, 23, 390-405. Trauner, D., Wulfeck, B., Tallal, P., & Hesselink, J. (2000). Neurological and MRI profiles of children with developmental language impairment. Developmental Medicine & Child Neurology, 42, 470-475. United States Census Bureau. (2015). 2014 National Population Projections. Retrieved from: https://www.census.gov/population/projections/data/national/2014.html Utah State Office of Education. (2010). Assessment and accountability. Retrieved from: http://www.schools.utah.gov/assessment/SE.aspx U.S. Department of Education Office for Civil Rights. (2000). The provision of an equal Education opportunity to limited-English proficient students. Retrieved from: http://www2.ed.gov/about/offices/list/ocr/eeolep/index.html # |
| Reference URL | https://collections.lib.utah.edu/ark:/87278/s6v15d48 |



