Masses of digital text. AIs with a large context window can process much more of it!

Gemini’s Long Context Window – a True Spec Cruncher

Maybe you’ve noticed that Google’s Gemini has been making gains on ChatGPT lately. Of all its recent impressive improvements, one of the lesser-sung features – at least in AI for Ed circles – is its much enhanced context window.

The context window is essentially how much text the AI can ‘remember’ and work with. Google’s next model boasts one million tokens of this memory, leaving other models – which count theirs in the hundreds of thousands – in the dust. It blows open the possibilities for a particular kind of AI task: working with long texts.

Language learners make use of all kinds of texts, of course. But one particularly unwieldy (although hugely useful) type where this new feature could help is the exam spec.

Exam Spec Crunching with AI

Language exam specs are roadmaps to qualifications, listing the knowledge and skills students need to demonstrate linguistic competency. But they have a lot of fine detail that can bog us down.

As a content creator, one thing that challenges me is teasing out this detail into some kind of meaningful arrangement for student activities. There is a mass of vocab data in there. And as systematic as it is, abstract lists of connectives, temporal adverbs and helper verbs don’t make for very student-friendly lesson material.

With a massive text cruncher like Gemini, they are a lot easier to process. Just drag in your spec PDF (I’ve been playing around with the new AQA GCSE German doc), and tease out the material in a more useful format for planning:

Take this German exam spec, and create an outline plan of three terms of twelve lessons that will cover all of the thematic material.

Additionally, it can help in creating resources that cover all bases:

Create a short reading text to introduce students to the exam topic “Celebrity Culture”. It should be appropriate for students aiming for the top tier mark in the spec. In the text, make sure to include all of the prepositions from the prescribed word list.

With a long textual memory, it’s even possible to interrogate the spec after you’ve uploaded it. That’s literally just asking questions of the document itself – and, with that bigger window, getting answers that don’t overlook half the content:

If students have one year to learn ALL of the prescribed vocabulary in the spec, how many words should they be learning a week? Organise them into weekly lists that follow a broadly thematic pattern.
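
If you like to double-check the maths (or batch the lists yourself once you’ve got the words out), the chunking is trivial in a few lines of Python. A rough sketch – the 1,700-word total here is just an assumed figure for illustration, not AQA’s actual count:

```python
# Split a prescribed vocab list into weekly batches.
# TOTAL_WORDS is an assumption for illustration -- check the
# actual count in your own spec.
TOTAL_WORDS = 1700   # hypothetical spec total
WEEKS = 39           # roughly one school year of term time

words_per_week = -(-TOTAL_WORDS // WEEKS)   # ceiling division
print(words_per_week)  # 44 words a week at these figures

def weekly_batches(words, per_week):
    """Chunk a flat vocab list into per-week lists."""
    return [words[i:i + per_week] for i in range(0, len(words), per_week)]

# Example with a toy list of 90 'words', 10 per week:
batches = weekly_batches([f"word{n}" for n in range(1, 91)], 10)
print(len(batches))  # 9 weekly lists
```

The thematic grouping is the part the AI genuinely adds; the arithmetic, at least, you can verify yourself.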

Supersized Context Window – Playing Soon at an AI Near You!

For sure, you can use these techniques on existing platforms straight away. However, with their smaller context windows, results might not always be 100% reliable (although it’s always fun trying!). For the new Google magic, we’ll have to wait just a little longer.

But from the initial signs, it definitely looks worth the wait!

Gemini’s new supersized context window is currently available only in a limited release, and only via its AI Studio playground. Expect to see it coming to Gemini Advanced very soon!

Foreign alphabet soup (image generated by AI)

AI Chat Support for Foreign Language Alphabets

I turn to AI first and foremost for content creation, as it’s so good at creating model foreign language texts. But it’s also a pretty good conversational tool for language learners.

That said, one of the biggest obstacles to using LLMs like ChatGPT for conversational practice can be an unfamiliar script. Ask it to speak Arabic, and you’ll get lots of Arabic script. It’s usually smart enough to work out if you’re typing back using Latin characters, but it’ll likely continue to speak in script.

Now, it’s easy enough to ask your AI platform of choice to transliterate everything into Latin characters, and to expect the same from you – simply instruct it to do so in your prompts. But blanket transliteration won’t help you develop native reading and writing skills. There’s a much better, best-of-both-worlds approach that does.

Best of Both Worlds AI Chat Prompt

This prompt sets up a basic conversation environment. The clincher is that it gives you the option to write in script or not. And if not, you’ll get what the script should look like modelled right back at you. It’s a great way to jump into conversation practice even before you’re comfortable switching keyboard layouts.

You are a Modern Greek language teacher, and you are helping me to develop my conversational skills in the language at level A2 (CEFR). Always keep the language short and simple at the given level, and always keep the conversation going with follow-up questions.

I will often type in transliterated Latin script, as I am still learning the target language alphabet. Rewrite all of my responses correctly in the target language script with any necessary grammatical corrections.

Similarly, write all of your own responses both in the target language script and also a transliteration in Latin characters. For instance,

Καλημέρα σου!
Kaliméra sou!

Do NOT give any English translations – the only support for me will be transliterations of the target language.

Let’s start off the conversation by talking about the weather.

This prompt worked pretty reliably in ChatGPT-4, Claude, Copilot, and Gemini. The first two were very strong; the latter two occasionally forgot the don’t translate! instruction, but otherwise, script support – the name of the game here – was good throughout.
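
And if you want a safety net that never forgets, a tiny offline transliterator covers the basics. This is just a toy sketch – the letter mapping is my own rough-and-ready choice, not a standard romanisation scheme:

```python
# Toy Greek-to-Latin transliteration for checking chat output offline.
# The mapping covers only common letters and is NOT a standard
# romanisation scheme -- just a rough-and-ready sketch.
GREEK_TO_LATIN = {
    "α": "a", "β": "v", "γ": "g", "δ": "d", "ε": "e", "ζ": "z",
    "η": "i", "θ": "th", "ι": "i", "κ": "k", "λ": "l", "μ": "m",
    "ν": "n", "ξ": "x", "ο": "o", "π": "p", "ρ": "r", "σ": "s",
    "ς": "s", "τ": "t", "υ": "u", "φ": "f", "χ": "ch", "ψ": "ps",
    "ω": "o", "ά": "a", "έ": "e", "ή": "i", "ί": "i", "ό": "o",
    "ύ": "u", "ώ": "o",
}

def transliterate(text: str) -> str:
    """Map Greek letters to Latin; leave anything unmapped as-is."""
    out = []
    for ch in text:
        latin = GREEK_TO_LATIN.get(ch.lower(), ch)
        # Preserve the capitalisation of the original letter
        out.append(latin.capitalize() if ch.isupper() else latin)
    return "".join(out)

print(transliterate("Καλημέρα σου"))  # Kalimera sou
```

Handy for sanity-checking the AI’s transliterations, or for romanising odd snippets without switching keyboards.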

Try changing the language (top) and topic (bottom) to see what it comes up with!

 

An illustration of a robot scribe writing AI prompts for ChatGPT or Gemini

Power to the Prompts : Fave Tweaks for AI Worksheets

Content creation is what AI excels at. That’s a gift to language learners and teachers, as it’s the easiest thing in the world to create a set of prompts to run off original, immersive worksheets.

AI isn’t great at everything, though, which is why our prompts need tweaking for its weaknesses – the things that are obvious gotchas to language folk, but need explaining to a general assistant like AI. Fortunately, in most cases, all it takes is an extra line or two to put things right.

Here are five of my favourite prompt-enhancers for worksheets, covering everything from vocab to copyright!

Five Tweaks for Perfect Prompts

Cut Out the Cognates

Cognates are generally great for language learners. As words that are instantly recognisable, they’re extra vocab for free. But for that very reason, they’re not the ones that worksheets should be making a song and dance about. It’s not particularly useful, for example, to have “der Manager” picked out in your German glossary as a key word. Yes, I worked that one out!

Try this in your vocab list prompts, bearing in mind that not all platforms will work equally well with it (Gemini aced it – ChatGPT sometimes gets it):

Don’t list items that are obvious English loanwords or cognates with English.

Highlight the Good Stuff

You know what I mean by the good stuff – those structures and snippets of language that are really frequent, and really reusable. I like to call them sentence frames – you can learn them, then switch in other words to add to your own linguistic repertoire.

You can ask AI to draw attention to any really pertinent ones in your target language texts:

Highlight (in bold and italics) the most pertinent key words and phrases for the topic, and provide a brief glossary of them at the end.

Make It Colloquial

Vanilla AI can sound bookish and formal. That’s no good as a model for everyday speech, so polish your prompts with a wee push to the colloquial:

Make the language colloquial and idiomatic, in the style of a native speaker.

Include an Answer Key

It might seem obvious, but if you’re making materials for self-study, then you will find an answer sheet indispensable. It’s an often overlooked finishing touch that makes a worksheet truly self-contained:

Include an answer key for all questions at the end.

Covering Your Back With Copyright

Copyright issues have been bubbling in the background for LLMs for some time now. They produce texts based on vast banks of training data, which isn’t original material, of course. But in theory, the texts that pop out should be completely original.

It can’t hurt to make that explicit, though. I like to add the following line to prompts, especially if I’m intending to share the material beyond personal use:

Ensure that the text is completely original and not lifted directly from any other source.

 

What are your favourite tweaks to make perfect prompts? Let us know in the comments!

Lots of websites floating around a central AI sphere

Powered by AI : Some Favourite Tools

If you’ve been following my recent posts, you’ll know that I’ve been embedding AI deeply into my language learning routine.

There are some truly mind-blowing ways to incorporate ‘raw AI’ – using direct prompts with LLMs like ChatGPT – into your learning, from live activities with personality, to custom content creation. But likewise, there are plenty of ready-to-run, AI-infused sites that you can use for language fun.

Here are a few of my favourites!

KOME.AI – YouTube Transcript Generator

I came across this when asked by a friend struggling to transcribe a long conference talk video for work. Surely there’s some way to automate that? Sure enough, there is – and it’s Kome.ai.

It’s not the only transcription service out there – there are numerous ones, competing for supremacy – but it’s the most straightforward, it’s multilingual, and it’s free. It’s also fast, seemingly drawing on already-existing auto-captions where available, before kicking in with other tools where necessary. I pasted in a short news clip about the German teacher shortage – I had a transcript almost immediately.

Kome.ai, generating YouTube transcripts with AI

Delphi’s Digital Clones

Most of the best prompt strategies involve telling AI who you want it to be. Delphi.ai has taken that a step further, by digitally cloning experts in their field – their words and big takes, at least – and making them available to the public. Think ‘coach in a box’.

While the site is set up for those wishing to clone knowledge-imparting versions of themselves (language coach, anyone?), you can browse and chat with many of their demo models. The philosopher collection is particularly enlightening.

PERPLEXITY.AI

AI’s whole bag is text generation. Now, the big tech companies talk a big game about these platforms also being digital assistants, but they’re basically content whizzes, and can still be lacking in other task areas. Searching seems to be one of these blind spots, as you’ll have realised quickly when faced with Bing’s sometimes laughably off-topic search results.

Perplexity.ai aims to change all that. The developers have taken an LLM, and purpose-designed it for finding sources and answering questions. Consequently, it’s much more useful for learners, educators, researchers, and anyone who doesn’t want their AI to completely miss the point. It’s the future of search.

Web search infused with AI from Perplexity.ai

SUNO.AI

AI-generated music has been sneaking up on us all very quietly. It was text generation that was making all the bolshy fuss, up to now. Music was still very much experimental, and out of the question unless you were running models on a powerful testing machine.

But suddenly, we have services that can create whole songs – including lyrics – from a simple prompt. Suno.ai not only gives you that for free – you have ten tracks a day for nothing – but it’s fast, and uncannily good for an early release. And, although they don’t shout from the rooftops about it, it’s also a polyglot!

These aren’t just great, handy, fun sites to use. They also show how broad the brush of AI is, and will be, in the future. They offer a taste of how embedded the tech will become in all sorts of areas of our lives in the coming years.

Are there any emerging AI services you’re a fan of? Let us know in the comments!

An illustration of a cute robot looking at a watch, surrounded by clocks, illustrating AI time-out

Avoiding Time-Out with Longer AI Content

If you’re using AI platforms to create longer language learning content, you’ll have hit the time-out problem at some point.

The issue is that large language models like ChatGPT and Bard use a lot of computing power at scale. To keep things to a sensible minimum, output limits are in place. And although they’re often generous, even on free platforms, they can fall short for many kinds of language learning content.

Multi-part worksheets and graded reader style stories are a case in point. They can stretch to several pages of print, far beyond most platform cut-offs. Some platforms (Microsoft Copilot, for instance) will just stop mid-sentence before a task is complete. Others may display a generation error. Very few will happily continue generating a lengthy text to the end.

You can get round it in many cases by simply stating “continue”. But that’s frustrating at best. And at worst, it doesn’t work at all; the model may ignore the last cut-off sentence, or lose its thread entirely. I’ve had times when a quirky Bing insists it’s finished, and refuses, like a surly tot, to pick up where it left off.

Avoiding Time-Out with Sectioning

Fortunately, there’s a pretty easy fix. Simply specify in your prompt that the output should be section by section. For example, take this prompt, reproducing the popular graded reader style of language learning text but without the length limits:

You are a language tutor and content creator, who writes completely original and exciting graded reader stories for learners of all levels. Your stories are expertly crafted to include high-frequency vocabulary and structures that the learner can incorporate into their own repertoire.

As the stories can be quite long, you output them one chapter at a time, prompting me to continue with the next chapter each time. Each 500-word chapter is followed by a short glossary of key vocabulary, and a short comprehension quiz. Each story should have five or six chapters, and have a well-rounded conclusion. The stories should include plenty of dialogue as well as prose, to model spoken language.

With that in mind, write me a story for French beginner learners (A1 on the CEFR scale) set in a dystopian future.

By sectioning, you avoid time-out. Now, you can produce some really substantial learning texts without having to prod and poke your AI to distraction!

There may even be an added benefit. I’ve noticed that the quality of texts output by section can be slightly higher than with all-at-once content. Perhaps this is connected to recent findings that instructing AI to think step by step, and break things down, improves results.

If there is a downside, it’s simply that sectioned output will take up more conversational turns. Instead of one reply ‘turn’, you’re getting lots of them. This eats into your per-conversation or per-hour allocation on ChatGPT Plus and Bing, for example. But the quality boost is worth it, I think.
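
If you ever move from the web chat to scripting against an API, the same section-by-section trick becomes a simple loop. A sketch – `ask` here is a hypothetical stand-in for whatever chat call your platform provides, not a real API:

```python
# Sketch of the section-by-section technique as a loop.
# `ask` is a stand-in for your platform's chat call -- hypothetical,
# not a real API. It takes a prompt and returns the model's reply.
def generate_story(ask, chapters=5):
    """Collect a long graded-reader story one chapter at a time."""
    story = [ask("Write chapter 1 of the story, about 500 words, "
                 "followed by a glossary and a short quiz.")]
    for n in range(2, chapters + 1):
        # Each 'continue' request keeps the reply under the
        # per-response length limit.
        story.append(ask(f"Continue with chapter {n}."))
    return "\n\n".join(story)

# Usage with a fake `ask` for demonstration:
fake_ask = lambda prompt: f"[reply to: {prompt[:20]}...]"
print(generate_story(fake_ask, chapters=3))
```

Each chapter arrives as its own reply, so no single response ever has to hit the cut-off.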

Has the section by section trick improved your language learning content? Let us know your experiences in the comments!

An image of a robot struggling with numbered blocks. AI has a problem with random ordering.

Totally Random! Getting Round AI Random Blindness in Worksheet Creation

If you’re already using AI for language learning content creation, you’ve probably already cried in horror at one of its biggest limitations. It’s terrible at putting items in a random order.

Random order in language learning exercises is pretty essential. For instance, a ‘missing words’ key below a gap-fill exercise should never list words in the same order as the questions they belong to.

Obvious, right? Well, to AI, it isn’t!

Just take the following prompt, which creates a mini worksheet with an introductory text and a related gap-fill exercise:

I am learning French, and you are a language teacher and content creator, highly skilled in worksheet creation.
Create a French worksheet for me on the topic “Environmentally-Friendly Travel”. The language level should be A2 on the CEFR scale, with clear language and a range of vocabulary and constructions.
The worksheet starts with a short text in the target language (around 250 words) introducing the topic.
Then, there follows a gap-fill exercise; this consists of ten sentences on the topic, related to the introductory text. A key content word is removed from each sentence for the student to fill in. For instance, ‘je —— en train’ (where ‘voyage’ is removed).
Give a list of the removed words in a random order below the exercise.

The output is very hit and miss – and much more miss! Perhaps 90% of the time, ChatGPT lists the answer key in the order of the questions. Either that, or it will produce feeble jumbling attempts, like reversing just the first two items on the list.

AI’s Random Issue

One prompt-tweaking tip you can try in these cases is SHOUTING. Writing this instruction in caps can sometimes increase the bullseyes. Put them IN RANDOM ORDER, darn it! It doesn’t help much here, though. It just doesn’t seem worth relying on Large Language Models like ChatGPT to produce random results.

The reason has something to do with the fundamental way these platforms function. They’re probability machines, guessing what word should come next based on calculations of how likely word X, Y or Z is to follow. Their whole rationale is not to be random; you might even call them anti-random machines.

No wonder they’re rubbish at it!

A Road Less Random

So how can we get round this in a reliable way that works every time?

The simplest fix, I’ve found, is to use another, non-random way to list things differently from the question order. And the easiest way to do that is simply to list things alphabetically:

I am learning French, and you are a language teacher and content creator, highly skilled in worksheet creation.
Create a French worksheet for me on the topic “Environmentally-Friendly Travel”. The language level should be A2 on the CEFR scale, with clear language and a range of vocabulary and constructions.
The worksheet starts with a short text in the target language (around 250 words) introducing the topic.
Then, there follows a gap-fill exercise; this consists of ten sentences on the topic, related to the introductory text. A key content word is removed from each sentence for the student to fill in. For instance, ‘je —— en train’ (where ‘voyage’ is removed).
Give a list of the removed words in alphabetical order below the exercise.

The likelihood of this order matching the question order is minimal. Hilariously, AI still manages to mess this up at times, adding the odd one or two out-of-place items at the end of the list, as if it forgot what it was doing, realised, and quickly bunged them back in. But the technique works just fine for stopping the order giving the answers away.
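
And if you’re happy to do a little post-processing yourself, you can sidestep AI’s random blindness entirely – jumbling (or alphabetising) the answer key takes a couple of lines of Python. The word list here is just a sample for illustration:

```python
import random

# Answer key in question order -- the order we must NOT show.
answers = ["voyage", "train", "vélo", "bus", "covoiturage",
           "recycler", "empreinte", "durable", "émissions", "planète"]

# Option 1: true random order, guaranteed different from question order.
key = answers[:]
while key == answers:          # re-shuffle in the unlikely tie case
    random.shuffle(key)

# Option 2: the alphabetical fix from the prompt, fully deterministic.
alpha_key = sorted(answers)

print(key)
print(alpha_key)
```

Computers, unlike LLMs, are rather good at shuffling lists – so let each tool do what it does best.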

A simple fix that basically ditches randomness completely, yes. But sometimes, the simplest fixes are the best!

Random blindness is a good reminder that AI isn’t a magical fix-all for language learning content creation. But, with an awareness of its limitations, we can still achieve some great results with workarounds.

Does AI have a noun problem? Strategies for avoiding it.

AI Has A Noun Problem : Let’s Fix It!

If you’re using AI for language learning content creation, you might have already spotted AI’s embarrassing secret. It has a noun problem.

Large Language Models like ChatGPT and Bard are generally great for creating systematic learning content. They’re efficient brainstormers, and can churn out lists and texts like there’s no tomorrow. One use case I’ve found particularly helpful is the creation of vocab lists – all the more so since it can spool them off in formats to suit learning tools like Anki.
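
That Anki step is easy to automate, too. Once the AI’s list is in a Python structure, a tab-separated file is all Anki’s import dialog needs – a minimal sketch with some sample German pairs:

```python
import csv
import io

# Vocab pairs as you might paste them from the AI's table output.
vocab = [
    ("das Umweltproblem", "the environmental problem"),
    ("recyceln", "to recycle"),
    ("umweltfreundlich", "environmentally friendly"),
]

# Anki imports tab-separated text: front<TAB>back, one card per line.
buffer = io.StringIO()
writer = csv.writer(buffer, delimiter="\t", lineterminator="\n")
writer.writerows(vocab)
tsv = buffer.getvalue()
print(tsv)

# Write it to a file and bring it in via Anki's File > Import:
# with open("vocab.txt", "w", encoding="utf-8") as f:
#     f.write(tsv)
```

Of course, you can also just ask the AI to output tab-separated text directly – but a script like this is handy when you’re cleaning up longer lists.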

But the more I’ve used it, the more it’s become apparent. AI has a blind spot that makes these straight-out-the-box vanilla lists much less useful than they could be.

A fixation with nouns.

Test it yourself; ask your platform of choice simply to generate a set of vocab items on a topic. Chances are there’ll be precious few items that aren’t nouns. And in my experience, more often than not, lists are composed entirely of noun items and nothing else.

ChatGPT-4 giving a list of French vocabulary items - all nouns.

It’s a curious bias, but I think it has something to do with how the LLM conceives key words. The term is somehow conflated with all the things to do with a topic. And nouns, we’re taught at school, are thing words.

Getting Over Your Noun Problem

Fortunately, there’s therapy for your AI to overcome its noun problem. And like most AI refining strategies, it just boils down to clearer prompting.

Here are some tips to ensure more parts-of-speech variety in your AI language learning content:

  1. Explicit Instruction: When requesting vocabulary lists, spell out what you want. Specify a mix of word types – nouns, verbs, adjectives, adverbs, etc. to nudge the AI towards a more balanced selection. When it doesn’t comply, just tell it so! More verbs, please is a good start.
  2. Increase the Word Count: Simply widening the net can work, if you’re willing to manually tweak the list afterwards. Increase your vocab lists to 20 or 30 items, and the chances of the odd verb or adjective appearing are greater.
  3. Contextual Requests: Instead of asking for lists, ask the AI to provide sentences or paragraphs where different parts of speech are used in context. This not only gives you a broader range of word types, but also shows them in action.
  4. Ask for Sentence Frames: Instead of single items, ask for sentence frames (or templates) that you can swap words in and out of. For instance, request a model sentence with a missing verb, along with 10 verbs that could fill that spot. “I ____ bread” might be a simple one for the topic food.
  5. Challenge the AI: Regularly challenge the AI with tasks that require a more nuanced understanding of language – like creating stories, dialogues, or descriptive paragraphs. This can push its boundaries and improve its output.

Example Prompts

Bearing those tips in mind, try these prompts for size. They should produce a much less noun-heavy set of vocab for your learning pleasure:

Create a vocabulary list of 20 French words on the topic “Food and Drink”. Make sure to include a good spread of nouns, verbs, adjectives and adverbs. For each one, illustrate the word in use with a useful sentence of about level A2 on the CEFR scale.
Give me a set of 5 French ‘sentence frames’ for learning and practising vocabulary on the topic “Summer Holidays”. Each frame should have a missing gap, along with five examples of French words that could fit in it.
Write me a short French text of around level A2 on the CEFR scale on the topic “Finding a Job in Paris”. Then, list the main content words from the text in a glossary below in table format.

Have you produced some useful lists with this technique? Let us know in the comments!

AI prompt engineering - the toolkit for getting better results from your platform of choice.

Better AI Language Learning Content with C-A-R-E

AI isn’t just for chat – it’s also great at making static language learning content. And as AI gains ground as a content creation assistant, prompt engineering – the art of tailoring your requests – becomes an ever more important skill.

As you’d expect, frameworks and best practice guides abound for constructing the perfect prompt. They’re generally all about defining your request with clarity, in order to minimise AI misfires and misunderstandings. Perhaps the most well-known and effective of these is R-T-F – that’s role, task, format. Tell your assistant who it is, what to do, and how you want the data to look at the end of it.

Recently, however, I’ve been getting even more reliable MFL content with another prompt framework: C-A-R-E. That is:

  • Context
  • Action
  • Result
  • Example(s)

Some of these steps clearly align with R-T-F. Context is a broader take on role, action matches task, and result roughly maps to format. But the kicker here is the addition of example(s). A wide-ranging academic investigation into effective prompting recently flagged “example-driven prompting” as an important factor in improving output, and for good reason: the whole concept of LLMs is built on constructing responses from training data – on parroting examples.
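
If you find yourself reusing the framework a lot, it’s easy to keep as a fill-in-the-blanks template. A quick sketch that assembles the four parts into a single prompt:

```python
def care_prompt(context: str, action: str, result: str, example: str) -> str:
    """Assemble a C-A-R-E prompt: Context, Action, Result, Example(s)."""
    return "\n".join([
        f"- {context}",
        f"- {action}",
        f"- {result}",
        f"- As an example: {example}",
    ])

prompt = care_prompt(
    context="I am a French learner; you are an expert content creator.",
    action="Create a gap-fill activity in French at CEFR A2 on 'Environment'.",
    result="Ten sentences, one key word removed from each; alphabetical key at the end.",
    example='"It is very important to look after the ---------- for future generations."',
)
print(prompt)
```

Swap the four arguments for each new worksheet, and the structure takes care of itself.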

Crafting AI prompts with C-A-R-E

As far as language content is concerned, C-A-R-E prompting is particularly good for ‘fixed format’ activity creation, like gap-fills or quizzes. There’s a lot of room for misinterpretation when describing a word game simply with words; a short example sets AI back on track. For example:

– I am a French learner creating resources for my own learning, and you are an expert language learning content creator.
– Create a gap-fill activity in French for students around level A2 of the CEFR scale on the topic “Environment”.
– It will consist of ten sentences on different aspects of the topic, with a key word removed from each one for me to fill out. Provide the missing words for me in an alphabetically sorted list at the end as a key.
– As an example, a similar question in English would look like this: “It is very important to look after the ———- for future generations.”

This produces excellent results in Microsoft Copilot / Bing (which we love for the freeness, obviously!) and ChatGPT. For example:

Creating AI language learning content with Microsoft Copilot / Bing Chat

Providing short examples seems like an obvious and intuitive step, but it’s surprising how infrequently we tend to do it in our AI prompts. The gains are so apparent that it’s worth making a note to always add a little C-A-R-E to your automatic content creation.

If you’ve been struggling to get reliable (or just plain sensible!) results with your AI language learning content, give C-A-R-E a try – and let us know how it goes in the comments!

ChatGPT releases custom GPT models

ChatGPT, Your Way : Custom GPTs In The Wild!

This week saw one of the biggest recent developments in consumer AI. ChatGPT released GPTs – customisable AI bots – into the wild for Plus members, and the community has gone wild.

In a nutshell, GPTs are AI bots with custom behaviour that you define. And you define that behaviour using natural language, just like how you talk to regular ChatGPT.

Crucially, GPTs are shareable. So you can come up with a killer app idea, set it up in seconds, then share your creation with the world. Already, linguists and language lovers are sharing their creations on the socials.

ChatGPT for Worksheet Creation

Obviously, I couldn’t wait to get playing when the GPT creation tool went live this week. I’ve long been a cheerleader for topic-based units for independent study, especially when preparing for spoken lessons. So the first thing I coded up was a foreign language worksheet creator!

It’s the kind of thing I’ve been writing and sharing prompts about for a while, now. The big game-changer, of course, is that now, all that functionality is packaged up into a single, one-click module. Open it, tell it your language, topic and level, and watch it go. This will produce a range of resources and activities for independent learning, including a vocabulary list, reading comprehensions, and cloze quizzes.

Genuinely useful for self-study!

Foreign Language Worksheet Creator GPT in ChatGPT

It’s already been a learning experience, for all of us tinkerers. For one thing, I found out not to overload it by trying to do too much at once, or turning on all its capabilities (browsing, code interpretation and image creation). I ended up with a uselessly slow initial version that I can no longer even reopen to edit.

Ah well – these things make us!

Old English Monkeys

When you do get a working version, however, you can boggle at the versatility of it. That’s thanks to the billions of training points backing up the platform. I asked it to create an Old English worksheet on the topic “Monkeys”, in the style of a Modern Languages worksheet, as a cheeky wee test. Admittedly, ChatGPT did say that it would be a challenging task. After all, just how many Old English documents do researchers train their LLMs on? But the results were really not bad at all…

An Old English worksheet in ChatGPT

 

I expect many of us are playing these games, pushing the new tech to see how far it can go. At the very least, we can all revisit those isolated prompt ideas we’ve been collecting over the past months, and turn them into shareable GPTs – for work and for fun.

Have you had a chance to play yet? Share your proud creations with us in the comments!

An illustration of a robot taking a picture of a book page, to illustrate AI image analysis in the context of language learning course books.

AI Image Analysis for Language Learners : Your Course Book Assistant!

Image upload and analysis is one of the most game-changing recent additions to AI platforms. Combined with a knack for text recognition, it’s possibly one of the most revolutionary for language (and other!) learners, too.

In short, if it’s on a page, you can now get it into AI and do things with it. Because of this, image analysis has huge potential for extending, and breathing new life into, printed materials, producing the very best synthesis of old and new tech.

A screenshot of AI chat in the Bing App, with an arrow showing the 'upload an image' function.

The image upload icon in the Bing app.

At its very simplest, it’s a handy summary and explanation tool. Just upload your page image, and prompt:

Analyse this page from my [LANGUAGE] course book. Summarise it in a few short bullet points I can use for revision.

Useful in its own right. But with some extra prompt magic, you can produce individually tailored support material on the fly – material that will help you to delve really deeply into those language learning texts, making it work for you.

Let’s see what it can do for starters!

Working with Vocab Lists

Vocab in Context

Take the most conventional form of book-based language learning data. Most course books have vocabulary lists and glossaries of the key words in the current chapter. But beyond the dialogues or passages they are attached to, there’s rarely any in-context use of them.

Personally, I find it really helpful to see individual items embedded in example sentences as an aide-mémoire. I usually seek them out in mass sentence banks and other manual-search online resources.

Even easier with AI:

Analyse this entire page from my [LANGUAGE] course book, noting all of the vocabulary items internally. Then, create a useful, practical sentence using each and every item. The sentences should relate to real-world contexts where possible. Make sure you include every single entry – don’t leave any out. Constantly double-check that the language is natural-sounding and grammatically correct. Output them in table format listing the word, your sentence and an English translation of that sentence.

ChatGPT Plus analysing a page of Swedish vocabulary.


The trick here is the analyse the entire page instruction. LLM platforms tend to take shortcuts when working with lists, sometimes skipping list items. Adding this stipulation is great at keeping them on track!

Rationalising Vocab Lists

You can also sort such material in an order that works better for you. For instance, I work best with vocab when I classify it first, be that by parts of speech, topic or otherwise. AI makes light work of it:

This is the material I’m currently studying in [LANGUAGE]. First, analyse the entire page, noting all of the vocabulary items listed. Then, rewrite that list, grouping the items by their grammatical part of speech and in alphabetical order. Where the word isn’t in its simple dictionary form, provide that too. Include any entries you couldn’t categorise at the end. Double-check throughout the process that a) you haven’t left out any items, and b) that your categorisation of each item is correct. If you detect errors, start again.

ChatGPT Plus analysing a page of Swahili vocabulary to create model sentences for context.

Microsoft Bing analysing a page of Swahili vocabulary to create model sentences for context.

You can also combine this with the AI Anki decks trick to really digitise those paper lists.
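As a bridge to that trick: Anki happily imports plain tab-separated text files, so one low-tech route is to paste the AI’s table output through a small script. This sketch assumes the pipe-table layout (word | sentence | translation) that the earlier prompt asks for – check your own output matches before relying on it:

```python
import csv

def table_to_anki_tsv(markdown_table: str, out_path: str) -> int:
    """Convert a pipe-delimited table (word | sentence | translation),
    as produced by the prompts above, into a tab-separated file that
    Anki's import dialog accepts. Returns the number of cards written."""
    rows = []
    for line in markdown_table.strip().splitlines():
        if not line.strip().startswith("|"):
            continue
        cells = [c.strip() for c in line.strip().strip("|").split("|")]
        # skip the header row and the |---|---| separator row
        if cells[0].lower() == "word" or set(cells[0]) <= {"-", ":"}:
            continue
        rows.append(cells)
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        csv.writer(f, delimiter="\t").writerows(rows)
    return len(rows)
```

Each table row becomes one note, ready to map onto front/back fields at import time.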

AI Translation Exercises

Now, how about some methods for actively working with vocab? Personally, I’m a big fan of the translation method. I know this isn’t everybody’s cup of tea (it’s one thing that turns some people off Duolingo), but if it works for you, you can produce a raft of exercises in seconds:

Here’s a page from my [LANGUAGE] course book. Analyse the entire page, noting all vocabulary items internally. Then, create a set of 20 practical, useful sentences using this vocabulary in context. Make them relevant to real-world, current affairs contexts where possible. Present half of these sentences in English and half in the target language for me to translate for practice. Add a key for any extra words you used that aren’t included in the list, as support. Add the translations of all sentences at the end as an answer key.

ChatGPT Plus analysing a page of Hebrew vocabulary to create translation exercises.

Bing analysing a page of Hebrew vocabulary to create translation exercises.

AI Exercises

You can also extend course book pages with worksheet-style practice exercises. Here’s a prompt that should produce a diverse set of activities in an output perfect for copy-pasting into a note, or PDF, to pore over on the move:

Here’s a page from my [LANGUAGE] course book. First, analyse the entire page, noting all vocabulary items, sentence frames and grammatical structures internally. Then, create a set of worksheet-style activities for me to practise using that material. Vary the activity types, including exercises like gap-fill / cloze, matching and translation. Add an answer key to all exercises at the end.

You might even like to try a more dynamic approach with this paper-to-exercise technique. The following prompt should set up a turn-based game (my favourite kind!) that recycles chapter vocab in live conversation:

Here’s a page from my [LANGUAGE] course book. First, analyse the entire page, noting all vocabulary items, sentence frames and grammatical structures internally. Then, let’s have a conversational, turn-based activity using the material. Present me, turn by turn, with a sentence in the target language using the vocabulary. I have to provide the missing word. Don’t give me any clues or model answers until I’ve made my response each turn!

Admittedly, turn-based language gaming worked better in Bing before recent updates forced it to focus solely on being a fancy search engine. If it does stray, just remind it that you’re playing a vocabulary game!

Choose Your Platform!

All these prompts have one thing in common: they play to the power of AI to take information and display it in different ways. That’s gold for learners, as the human brain learns best when presented with material in multiple, not monotonous formats. For one thing, this helps beat the context trap of repetitious learning. Recycling vocab in as many ways as possible is key to remembering it in unlimited, unpredictable future situations.

Tech-wise, you’ll see that I’ve used Bing in most of these examples. It’s an excellent place to start if you’re new to AI yourself, not least because it’s accessible, user-friendly and completely free! Additionally, the Bing app allows you to snap a book page easily with your phone camera. And Bing’s out-of-the-box internet connectivity gives it more breadth and up-to-the-minute relevance when creating your materials.

That said, you can use these prompts with any platform that allows image uploads. ChatGPT, for instance, has the added bonus of multiple uploads – i.e., several pages – so you can process a larger chunk of a chapter in one go.

Whichever platform you choose, the most important piece of advice remains the same: don’t just stick with these potted prompts. Instead, experiment constantly to find what works for you, building up your own prompt library to copy-paste. AI can, and should, be an incredibly personalised experience. Good luck making it your own!

Have you used AI image analysis in your learning? Let us know your own tips and tricks in the comments!