My previous post introduced the Common European Framework of Reference for Languages (CEFR), the de facto standard for measuring one’s Spanish language proficiency. Now that learners know roughly their level of proficiency, they may want to know whether a specific Spanish text is at about the right level for them. Materials that are too easy or too difficult are not conducive to effective learning.
Duolingo has made an Artificial-Intelligence-based CEFR checker available online. You paste the target Spanish text into a text box, and it tells you the CEFR level, both overall and for each individual word.
Like all things A.I. these days, when it works, it is great, but it can look silly otherwise. The above example—my first test of the tool—shows the output for the sentence ‘Las abogadas entregan los papeles.’ I took the sentence straight out of an actual Duolingo exercise. Overall, the sentence was classified as a ‘C’, or advanced. That is puzzling, to say the least, because the sentence should not require an advanced reading proficiency level. Further examination of the output reveals that, according to the checker, there is only one ‘C’ word—‘papeles’, meaning papers. Why would ‘papeles’ be considered an advanced word?
Below is another test I conducted, using the sentence ‘Nosotros importamos cerveza de Alemania’.
It seems that the checker gave up on classifying the word ‘importamos’. Strange, since the exact sentence came from a Duolingo exercise. Does the checker have trouble understanding conjugation?
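One plausible explanation is that a checker looks up surface forms of words rather than their dictionary forms, so a conjugated verb like ‘importamos’ (first-person plural of ‘importar’) misses the word list entirely. Below is a minimal sketch of lemma-based lookup; the word list and lemma table are hypothetical examples for illustration, not Duolingo’s actual data or method.

```python
# Toy illustration: map a conjugated verb form to its dictionary
# form (lemma) before looking up its CEFR level.
# Both tables below are invented for this example.
CEFR_LEVELS = {"importar": "A2", "cerveza": "A1", "Alemania": "A1"}
LEMMAS = {"importamos": "importar"}  # first-person plural -> infinitive

def cefr_level(word):
    lemma = LEMMAS.get(word, word)   # fall back to the surface form
    return CEFR_LEVELS.get(lemma)    # None if the word is unknown

# A surface-form-only lookup fails on the conjugated form:
print(CEFR_LEVELS.get("importamos"))  # -> None
# Lemmatizing first succeeds:
print(cefr_level("importamos"))       # -> A2
```

If the real checker skips this lemmatization step for some inflected forms, that would explain why it could not classify ‘importamos’.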
Presently, the checker seems rough around the edges and is not at the same level of maturity as the main Duolingo app itself.