Lots of websites floating around a central AI sphere

Powered by AI : Some Favourite Tools

If you’ve been following my recent posts, you’ll know that I’ve been embedding AI deeply into my language learning routine.

There are some truly mind-blowing ways to incorporate ‘raw AI’ – using direct prompts with LLMs like ChatGPT – into your learning, from live activities with personality, to custom content creation. But likewise, there are plenty of ready-to-run, AI-infused sites that you can use for language fun.

Here are a few of my favourites!

KOME.AI – YouTube Transcript Generator

I came across this when a friend asked me for help transcribing a long conference talk video for work. Surely there’s some way to automate that? Sure enough, there is – and it’s Kome.ai.

It’s not the only transcription service out there – there are numerous ones, competing for supremacy – but it’s the most straightforward, it’s multilingual, and it’s free. It’s also fast, seemingly drawing on already-existing auto-captions where available, before kicking in with other tools where necessary. I pasted in the link for a short news clip about the German teacher shortage, and had a transcript almost immediately.
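
Kome.ai keeps its internals to itself, but if you fancy a DIY version of that auto-caption trick, it takes surprisingly little code. Here’s a minimal Python sketch using the open-source youtube-transcript-api package – not Kome.ai’s actual method, just the same idea, and the exact call may vary between package versions (the video ID is only an example):

```python
# DIY sketch of the auto-caption idea (not Kome.ai's own method): pull a
# video's existing YouTube captions with the youtube-transcript-api package.
# Assumes: pip install youtube-transcript-api
from youtube_transcript_api import YouTubeTranscriptApi

VIDEO_ID = "dQw4w9WgXcQ"  # example ID only - swap in your own video

# Ask for German captions first, falling back to English if none exist.
segments = YouTubeTranscriptApi.get_transcript(VIDEO_ID, languages=["de", "en"])

# Each segment is a dict with 'text', 'start' and 'duration' keys;
# join the text pieces into one plain transcript.
transcript = " ".join(segment["text"] for segment in segments)
print(transcript)
```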

Kome.ai, generating YouTube transcripts with AI

Delphi’s Digital Clones

Most of the best prompt strategies involve telling AI who you want it to be. Delphi.ai has taken that a step further, by digitally cloning experts in their field – their words and big takes, at least – and making them available to the public. Think ‘coach in a box’.

While the site is set up for those wishing to clone knowledge-imparting versions of themselves (language coach, anyone?), you can browse and chat with many of their demo models. The philosopher collection is particularly enlightening.

PERPLEXITY.AI

AI’s whole bag is text generation. Now, the big tech firms talk a big game about these platforms also being digital assistants, but they’re basically content whizzes, and can still be lacking in other task performance areas. Searching seems to be one of these blind spots, which you’ll have realised quickly when faced with Bing’s sometimes laughably off-topic search results.

Perplexity.ai aims to change all that. The developers have taken an LLM, and purpose-designed it for finding sources and answering questions. Consequently, it’s much more useful for learners, educators, researchers, and anyone who doesn’t want their AI to completely miss the point. It’s the future of search.

Web search infused with AI from Perplexity.com

SUNO.AI

AI-generated music has been sneaking up on us all very quietly. It was text generation that was making all the bolshy fuss, up to now. Music was still very much experimental, and out of the question unless you were running models on a powerful testing machine.

But suddenly, we have services that can create whole songs – including lyrics – from a simple prompt. Suno.ai not only gives you that for free – you have ten tracks a day for nothing – but it’s fast, and uncannily good for an early release. And, although they don’t shout from the rooftops about it, it’s also a polyglot!

These aren’t just great, handy, fun sites to use. They also show how broad the brush of AI is, and will be, in the future. They offer a taste of how embedded the tech will become in all sorts of areas of our lives in the coming years.

Are there any emerging AI services you’re a fan of? Let us know in the comments!

Two AI robots squaring up to each other

AI Worksheet Wars : Google Gemini Advanced vs. ChatGPT-4

With this week’s release of Gemini Advanced, Google’s latest, premium AI model, we have another platform for language learning content creation.

Google fanfares Gemini as the “most capable AI model” yet, releasing benchmark results that position it as a potential ChatGPT-4 beater. Significantly, Google claims that their new top model even outperforms humans at some language-based benchmarking.

So what do those improvements hold for language learners? I decided to put Gemini Advanced head-to-head with the leader to date, ChatGPT-4, to find out. I used the following prompt on both ChatGPT-4 and Gemini Advanced to create a topic prep style worksheet like those I use before lessons. A target language text, vocab support, and practice questions – perfect topic prep:

Create an original, self-contained French worksheet for students of the language who are around level A2 on the CEFR scale. The topic of the worksheet is “Reality TV in France”.

The worksheet format is as follows:

– An engaging introductory text (400 words) using clear and idiomatic language
– Glossary of 10 key words / phrases from the text (ignore obvious cognates with English) in table format
– Reading comprehension quiz on the text (5 questions)
– Gap-fill exercise recycling the same vocabulary and phrases in a different order (10 questions)
– ‘Talking about it’ section with useful phrases for expressing opinions on the topic
– A model dialogue (10-12 lines) between two people discussing the topic
– A set of thoughtful questions to spark further dialogue on the topic
– An answer key covering all the questions

Ensure the language is native-speaker quality and error-free.
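
If you ever want to script a head-to-head like this rather than pasting into the chat windows, a rough sketch along the following lines would do the job – assuming API keys for both services, and bearing in mind that the models exposed via API may not match the premium chat versions discussed here:

```python
# Rough sketch: send the same worksheet prompt to two models and save both
# outputs for side-by-side comparison. Assumes API keys for OpenAI and Google
# (pip install openai google-generativeai); the model names are placeholders.
import google.generativeai as genai
from openai import OpenAI

WORKSHEET_PROMPT = "Create an original, self-contained French worksheet ..."  # paste the full prompt above

# ChatGPT side (reads OPENAI_API_KEY from the environment).
openai_client = OpenAI()
gpt_reply = openai_client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{"role": "user", "content": WORKSHEET_PROMPT}],
)

# Gemini side.
genai.configure(api_key="YOUR_GOOGLE_API_KEY")  # placeholder key
gemini_model = genai.GenerativeModel("gemini-1.5-pro")  # placeholder model name
gemini_reply = gemini_model.generate_content(WORKSHEET_PROMPT)

# Save each output to its own file for easy comparison.
for name, text in [("chatgpt", gpt_reply.choices[0].message.content),
                   ("gemini", gemini_reply.text)]:
    with open(f"worksheet_{name}.md", "w", encoding="utf-8") as f:
        f.write(text)
```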

I then laid out the results, with minimal extra formatting, in PDF files (much as I’d use them for my own learning).

Here are the results.

ChatGPT-4

ChatGPT-4 gives solid results, much as expected. I’d been using that platform for my own custom learning content for a while, and it’s both accurate and dependable.

The introductory text wove in real-world references to the topic very well, albeit in a slightly dry tone. The glossary was reasonable, although ChatGPT-4 had, as usual, problems leaving out “obvious cognates” as per the prompt instructions. It’s a problem I’ve noticed often with other LLMs, too – workarounds are often necessary to fix these biases.

Likewise, the gap-fill was not “in a different order”, as prompted (and again, exposing a weakness of most LLMs). The questions are in the same order as the glossary entries they refer to!

Looking past those issues – which we could easily correct manually, in any case – the questions were engaging and sensible. Let’s give ChatGPT-4 a solid B!

A French worksheet on Reality TV, created by AI platform ChatGPT-4.

You can download the ChatGPT-4 version of the worksheet from this link.

Gemini Advanced

And onto the challenger! I must admit, I wasn’t expecting to see huge improvements here.

But instantly, I prefer the introductory text. It’s stylistically more interesting; it simply has the “engaging” quality I asked for. It’s hard to judge reliably, but I also think it’s closer to a true CEFR A2 language level. Compare it with the encyclopaedia-style ChatGPT-4 version, and it’s more conversational, and certainly more idiomatic.

That attention to idiom is apparent in the glossary, too. There’s far less of that cognate problem here, making for a much more practical vocab list. We have some satisfyingly colloquial phrasal verbs that make me feel that I’m learning something new.

And here’s the clincher: Gemini Advanced aced the randomness test. While the question quality matched ChatGPT-4, the random delivery means the output is usable off the bat. I’m truly impressed by that.

A French worksheet on Reality TV, created by Google's premium AI platform, Gemini Advanced.

You can download the Gemini Advanced version of the worksheet from this link.

Which AI?

After that storming performance by Gemini Advanced, you might expect my answer to be unqualified support for that platform. And, content-wise, I think it did win, hands down. The attention to the nuance of my prompt was something special, and the texts are just more interesting to work with. Big up for creativity.

That said, repeated testing of the prompt did throw up the occasional glitch. Sometimes, it would fail to output the answers, instead showing a cryptic “Answers will follow.” or similar, requiring further prompting. Once or twice, the service went down, too, perhaps a consequence of huge traffic during release week. They’re minor things for the most part, and I expect Google will be busy ironing them out over the coming months.

Nonetheless, the signs are hugely promising, and it’s up to ChatGPT-4 now to come back with an even stronger next release. I’ll be playing around with Gemini Advanced a lot in the next few weeks – I really recommend that other language learners and teachers give it a look, too!

If you want to try Google’s Gemini Advanced, there’s a very welcome two-month free trial. Simply head to Gemini to find out more!

An illustration of a cute robot looking at a watch, surrounded by clocks, illustrating AI time-out

Avoiding Time-Out with Longer AI Content

If you’re using AI platforms to create longer language learning content, you’ll have hit the time-out problem at some point.

The issue is that large language models like ChatGPT and Bard use a lot of computing power at scale. To keep things to a sensible minimum, output limits are in place. And although they’re often generous, even on free platforms, they can fall short for many kinds of language learning content.

Multi-part worksheets and graded reader style stories are a case in point. They can stretch to several pages of print, far beyond most platform cut-offs. Some platforms (Microsoft Copilot, for instance) will just stop mid-sentence before a task is complete. Others may display a generation error. Very few will happily continue generating a lengthy text to the end.

You can get round it in many cases by simply stating “continue”. But that’s frustrating at best. And at worst, it doesn’t work at all; it may ignore the last cut-off sentence, or lose its thread entirely. I’ve had times when a quirky Bing insists it’s finished, and refuses, like a surly tot, to pick up where it left off.

Avoiding Time-Out with Sectioning

Fortunately, there’s a pretty easy fix. Simply specify in your prompt that the output should be delivered section by section. For example, take this prompt, reproducing the popular graded reader style of language learning text but without the length limits:

You are a language tutor and content creator, who writes completely original and exciting graded reader stories for learners of all levels. Your stories are expertly crafted to include high-frequency vocabulary and structures that the learner can incorporate into their own repertoire.

As the stories can be quite long, you output them one chapter at a time, prompting me to continue with the next chapter each time. Each 500-word chapter is followed by a short glossary of key vocabulary, and a short comprehension quiz. Each story should have five or six chapters, and have a well-rounded conclusion. The stories should include plenty of dialogue as well as prose, to model spoken language.

With that in mind, write me a story for French beginner learners (A1 on the CEFR scale) set in a dystopian future.

By sectioning, you avoid time-out. Now, you can produce some really substantial learning texts without having to prod and poke your AI to distraction!
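
If you’d rather script this than type “continue” by hand, the same section-by-section idea maps neatly onto a loop of API calls. Below is a rough sketch, assuming the OpenAI Python SDK and an API key – the model name and chapter count are just placeholders to adjust:

```python
# Rough sketch: section-by-section generation as a loop of 'continue' turns.
# Assumes the OpenAI Python SDK (pip install openai) and an OPENAI_API_KEY.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "You are a language tutor who writes original graded reader stories. "
    "Output the story one 500-word chapter at a time, each followed by a "
    "short glossary and a comprehension quiz."
)

messages = [
    {"role": "system", "content": SYSTEM_PROMPT},
    {"role": "user", "content": "Write chapter 1 of a French A1 story set in a dystopian future."},
]

chapters = []
for chapter_number in range(1, 6):  # five chapters in total
    reply = client.chat.completions.create(model="gpt-4o", messages=messages)
    chapter_text = reply.choices[0].message.content
    chapters.append(chapter_text)
    # Keep the whole conversation so the model remembers its own plot,
    # then ask for the next instalment.
    messages.append({"role": "assistant", "content": chapter_text})
    if chapter_number < 5:
        messages.append({"role": "user", "content": f"Continue with chapter {chapter_number + 1}."})

print("\n\n".join(chapters))
```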

There may even be an added benefit. I’ve noticed that the quality of texts output section by section can be slightly higher than with all-at-once content. Perhaps this is connected to recent findings that instructing AI to think step by step, and to break things down, improves results.

If there is a downside, it’s simply that sectioned output will take up more conversational turns. Instead of one reply ‘turn’, you’re getting lots of them. This eats into your per-conversation or per-hour allocation on ChatGPT Plus and Bing, for example. But the quality boost is worth it, I think.

Has the section by section trick improved your language learning content? Let us know your experiences in the comments!

An image of a robot struggling with numbered blocks. AI has a problem with random ordering.

Totally Random! Getting Round AI Random Blindness in Worksheet Creation

If you’re using AI for language learning content creation, you’ve probably already cried in horror at one of its biggest limitations. It’s terrible at putting items in a random order.

Random order in language learning exercises is pretty essential. For instance, a ‘missing words’ key below a gap-fill exercise should never list words in the same order as the questions they belong to.

Obvious, right? Well, to AI, it isn’t!

Just take the following prompt, which creates a mini worksheet with an introductory text and a related gap-fill exercise:

I am learning French, and you are a language teacher and content creator, highly skilled in worksheet creation.
Create a French worksheet for me on the topic “Environmentally-Friendly Travel”. The language level should be A2 on the CEFR scale, with clear language and a range of vocabulary and constructions.
The worksheet starts with a short text in the target language (around 250 words) introducing the topic.
Then, there follows a gap-fill exercise; this consists of ten sentences on the topic, related to the introductory text. A key content word is removed from each sentence for the student to fill in. For instance, ‘je —— en train’ (where ‘voyage’ is removed).
Give a list of the removed words in a random order below the exercise.

The output is very hit and miss – and much more miss! Perhaps 90% of the time, ChatGPT lists the answer key in the order of the questions. Either that, or it will produce feeble jumbling attempts, like reversing just the first two items on the list.

AI’s Random Issue

One prompt-tweaking tip you can try in these cases is SHOUTING. Writing this instruction in caps can sometimes increase the bullseyes. Put them IN RANDOM ORDER, darn it! It doesn’t help much here, though. It just doesn’t seem worth relying on Large Language Models like ChatGPT to produce random results.

The reason has something to do with the fundamental way these platforms function. They’re probability machines, guessing what word should come next based on calculations of how likely word X, Y or Z is to follow. Their whole rationale is not to be random; you might even call them anti-random machines.

No wonder they’re rubbish at it!

A Road Less Random

So how can we get round this in a reliable way that works every time?

The simplest fix, I’ve found, is to use another, non-random way of listing things differently from the question order. And the easiest way to do that is simply to list them alphabetically:

I am learning French, and you are a language teacher and content creator, highly skilled in worksheet creation.
Create a French worksheet for me on the topic “Environmentally-Friendly Travel”. The language level should be A2 on the CEFR scale, with clear language and a range of vocabulary and constructions.
The worksheet starts with a short text in the target language (around 250 words) introducing the topic.
Then, there follows a gap-fill exercise; this consists of ten sentences on the topic, related to the introductory text. A key content word is removed from each sentence for the student to fill in. For instance, ‘je —— en train’ (where ‘voyage’ is removed).
Give a list of the removed words in alphabetical order below the exercise.

The likelihood of this order being the same as the questions is minimal. Hilariously, AI still manages to mess this order up at times, adding the odd one or two out-of-place items at the end of the list, as if it forgot what it was doing, realised, and quickly bunged them back in. But the technique works just fine for stopping the order from giving the answers away.

A simple fix that basically ditches randomness completely, yes. But sometimes, the simplest fixes are the best!
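
If you really do want a jumbled key rather than an alphabetical one, another option – outside the prompt entirely – is to let the model write the exercise and then shuffle the answers yourself. A quick Python sketch of that idea, with a made-up word list standing in for the model’s output:

```python
# Genuine randomness, handled outside the LLM: shuffle the answer key locally.
import random

# Hypothetical answer key, in question order, as a model might return it.
answer_key = [
    "voyage", "train", "vélo", "empreinte", "recycler",
    "covoiturage", "durable", "émissions", "bus", "marche",
]

shuffled_key = answer_key.copy()
random.shuffle(shuffled_key)  # in-place shuffle of the copy

print("Mots à replacer :", ", ".join(shuffled_key))
```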

Random blindness is a good reminder that AI isn’t a magical fix-all for language learning content creation. But, with an awareness of its limitations, we can still achieve some great results with workarounds.

Does AI have a noun problem? Strategies for avoiding it.

AI Has A Noun Problem : Let’s Fix It!

If you’re using AI for language learning content creation, you might have already spotted AI’s embarrassing secret. It has a noun problem.

Large Language Models like ChatGPT and Bard are generally great for creating systematic learning content. They’re efficient brainstormers, and can churn out lists and texts like there’s no tomorrow. One use case I’ve found particularly helpful is the creation of vocab lists – all the more so since it can spool them off in formats to suit learning tools like Anki.
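
For Anki in particular, all you really need is a tab-separated text file, which the app’s import dialog handles happily. Here’s a tiny Python sketch of that last step, with a hypothetical word list standing in for the AI’s output:

```python
# Minimal sketch: turn an AI-generated vocab list into a tab-separated file
# that Anki's 'Import File' dialog accepts (one note per line, front<TAB>back).
import csv

# Hypothetical AI output, as (French, English) pairs.
vocab = [
    ("améliorer", "to improve"),
    ("environ", "roughly, about"),
    ("la randonnée", "hiking"),
]

with open("french_vocab.txt", "w", encoding="utf-8", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    writer.writerows(vocab)

print(f"Wrote {len(vocab)} notes to french_vocab.txt")
```

During import, you can then map the two columns to the front and back of each card.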

But the more I’ve used it, the more it’s become apparent. AI has a blind spot that makes these straight-out-the-box vanilla lists much less useful than they could be.

A fixation with nouns.

Test it yourself; ask your platform of choice simply to generate a set of vocab items on a topic. Chances are there’ll be precious few items that aren’t nouns. And in my experience, more often than not, lists are composed entirely of noun items and nothing else.

ChatGPT-4 giving a list of French vocabulary items – all nouns.

It’s a curious bias, but I think it has something to do with how the LLM conceives of key words. The term somehow gets conflated with all the things to do with a topic. And nouns, we’re taught at school, are thing words.
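
If you’d like to put a number on the bias rather than just eyeballing it, a quick part-of-speech count does the job. A rough sketch using spaCy’s small French model (an extra install, and the vocab list here is just an invented example of typical output):

```python
# Rough sketch: measure the noun bias in a generated vocab list with spaCy.
# Assumes: pip install spacy && python -m spacy download fr_core_news_sm
from collections import Counter

import spacy

nlp = spacy.load("fr_core_news_sm")

# Hypothetical list straight from an LLM - typically noun after noun.
vocab_list = ["la plage", "le billet", "voyager", "l'hôtel", "la valise"]

# Tag each item and take the part of speech of its last token (skipping articles).
pos_counts = Counter(doc[-1].pos_ for doc in nlp.pipe(vocab_list))
print(pos_counts)  # e.g. Counter({'NOUN': 4, 'VERB': 1})
```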

Getting Over Your Noun Problem

Fortunately, there’s therapy for your AI to overcome its noun problem. And like most AI refining strategies, it just boils down to clearer prompting.

Here are some tips to ensure more parts-of-speech variety in your AI language learning content:

  1. Explicit Instruction: When requesting vocabulary lists, spell out what you want. Specify a mix of word types – nouns, verbs, adjectives, adverbs, etc. – to nudge the AI towards a more balanced selection. When it doesn’t comply, just tell it so! “More verbs, please” is a good start.
  2. Increase the Word Count: Simply widening the net can work, if you’re willing to manually tweak the list afterwards. Increase your vocab lists to 20 or 30 items, and the chances of the odd verb or adjective appearing are greater.
  3. Contextual Requests: Instead of asking for lists, ask the AI to provide sentences or paragraphs where different parts of speech are used in context. This not only gives you a broader range of word types, but also shows them in action.
  4. Ask for Sentence Frames: Instead of single items, ask for sentence frames (or templates) that you can swap words in and out of. For instance, request a model sentence with a missing verb, along with 10 verbs that could fill that spot. “I ____ bread” might be a simple one for the topic of food.
  5. Challenge the AI: Regularly challenge the AI with tasks that require a more nuanced understanding of language – like creating stories, dialogues, or descriptive paragraphs. This can push its boundaries and improve its output.

Example Prompts

Bearing those tips in mind, try these prompts for size. They should produce a much less noun-heavy set of vocab for your learning pleasure:

Create a vocabulary list of 20 French words on the topic “Food and Drink”. Make sure to include a good spread of nouns, verbs, adjectives and adverbs. For each one, illustrate the word in use with a useful sentence of about level A2 on the CEFR scale.
Give me a set of 5 French ‘sentence frames’ for learning and practising vocabulary on the topic “Summer Holidays”. Each frame should have a missing gap, along with five examples of French words that could fit in it.
Write me a short French text of around level A2 on the CEFR scale on the topic “Finding a Job in Paris”. Then, list the main content words from the text in a glossary below in table format.

Have you produced some useful lists with this technique? Let us know in the comments!

AI prompt engineering - the toolkit for getting better results from your platform of choice.

Better AI Language Learning Content with C-A-R-E

AI isn’t just for chat – it’s also great at making static language learning content. And as AI gains ground as a content creation assistant, prompt engineering – the art of tailoring your requests – becomes an ever more important skill.

As you’d expect, frameworks and best practice guides abound for constructing the perfect prompt. They’re generally all about defining your request with clarity, in order to minimise AI misfires and misunderstandings. Perhaps the most well-known and effective of these is R-T-F – that’s role, task, format. Tell your assistant who it is, what to do, and how you want the data to look at the end of it.

Recently, however, I’ve been getting even more reliable MFL content with another prompt framework: C-A-R-E. That is:

  • Context
  • Action
  • Result
  • Example(s)

Some of these steps clearly align with R-T-F. Context is a broader take on role, action maps to task, and result roughly to format. But the kicker here is the addition of example(s). A wide-ranging academic investigation into effective prompting recently flagged “example-driven prompting” as an important factor in improving output, and for good reason: the whole concept of LLMs is built on constructing responses from training data – on parroting examples, in other words.
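
One way to make the framework second nature is to think of every prompt as four labelled slots that you fill in and glue together. A throwaway Python sketch of that habit (the slot contents are only placeholders):

```python
# Throwaway sketch: assemble a C-A-R-E prompt from its four labelled slots.
def care_prompt(context: str, action: str, result: str, example: str) -> str:
    """Join the Context, Action, Result and Example(s) slots into one prompt."""
    return "\n".join([context, action, result, f"As an example: {example}"])

prompt = care_prompt(
    context="I am a French learner; you are an expert language learning content creator.",
    action="Create a ten-sentence gap-fill activity in French at CEFR A2 on the topic 'Environment'.",
    result="List the missing words alphabetically at the end as a key.",
    example='"It is very important to look after the ---------- for future generations."',
)
print(prompt)
```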

Crafting AI prompts with C-A-R-E

As far as language content is concerned, C-A-R-E prompting is particularly good for ‘fixed format’ activity creation, like gap-fills or quizzes. There’s a lot of room for misinterpretation when describing a word game simply with words; a short example sets AI back on track. For example:

– I am a French learner creating resources for my own learning, and you are an expert language learning content creator.
– Create a gap-fill activity in French for students around level A2 of the CEFR scale on the topic “Environment”.
– It will consist of ten sentences on different aspects of the topic, with a key word removed from each one for me to fill out. Provide the missing words for me in an alphabetically sorted list at the end as a key.
– As an example, a similar question in English would look like this: “It is very important to look after the ———- for future generations.”

This produces excellent results in Microsoft Copilot / Bing (which we love for the freeness, obviously!) and ChatGPT. For example:

Creating AI language learning content with Microsoft Copilot / Bing Chat

Providing short examples seems like an obvious and intuitive step, but it’s surprising how infrequently we tend to do it in our AI prompts. The gains are so apparent that it’s worth making a note to always add a little C-A-R-E to your automatic content creation.

If you’ve been struggling to get reliable (or just plain sensible!) results with your AI language learning content, give C-A-R-E a try – and let us know how it goes in the comments!

Neon musical notes

Target Language Pop Music on tap! Meet Suno.ai

Pop music can be a really good route in foreign language learning.

Target language pop has long been a great way to practise listening. Diction tends to be slower and more deliberate, and you often have rhyme to help as an aide-memoire. Learning snippets of song lyrics gives you reusable phrases, as well as a feel for the sound shape of the language.

Up until now, it’s been a case of hunting down artists whose lyrics resonate with you. Easier said than done if you’re new to the target language culture. But now, thanks to a new AI-powered site, you can create music in a language and style of your choice, simply by asking for it!

Instant Music, On Demand

Suno.ai has emerged from obscurity in recent weeks to a flurry of excitement. It takes a musical prompt and transforms it into a fully fleshed-out track, vocals and all. And the best bit?

It works in multiple languages!

It’s incredibly easy to use – you don’t even need any actual lyrics to start with. In simple mode, you just describe, in a few words, the song you want:

Using the simple mode to make foreign language pop music in suno.ai

However, I’ve found it even better when combined with another platform to customise the lyrics. Here, I used ChatGPT to create the text first, tweaked it manually a little, then pasted it into suno.ai’s Custom Mode for the music:

Generating better song lyrics (without music) in ChatGPT
Using custom mode in suno.ai to make music with pre-written lyrics

Here’s my rather jolly track “Der fröhliche Gorilla” from the above prompt, complete with album art. Sound quality is middling right now, but it’s exciting to think how much this will probably improve over the coming year. The free account also tends to chop tracks off suddenly, but for a free resource, it’s pretty great!

With a pretty generous 10 free tracks a day up to 1:20 long, you can get a lot out of the free tier. I may well upgrade in any case, as it’s so much fun, and so useful, that I’d love some longer, more polished tracks.

Created any tracks you’re proud of in suno.ai? Please share them with us in the comments!

Language learning - making sense of the wall of words.

Language Learning Treats 2023 – for Christmas and Beyond!

That rolled around quickly again, didn’t it?

2023 has been a year of language ups and downs. Amidst some sadder news, like the mothballing of old courses, and language department struggles at leading universities, there was a lot to celebrate, too. AI has gone big in the language learning world, supporting learners everywhere for free. And the non-Duolingo crowd of apps has only got stronger (continual love to Duolingo when you control the owl, of course). Offerings like Lingvist and Lingodeer now give learners more choice than ever.

It’s all got me a bit nostalgic for my own year of language learning treats. I’ve enjoyed so much of what’s been on offer this year, free and otherwise. It’s only proven to me what a well-supported bunch we are in the polyglot world. And long may that continue.

Anyway, here are a few 2023 treats that were right up my street. I hope you like them, too!

Speak Gaelic!

Speak Gaelic learners have great cause for cheer this season. The BBC’s vast new offering for Scottish Gaelic learners has been a shot in the arm for learners of this beautiful, precious Celtic language, and goes from strength to strength. It’s filled a gap left by the equally excellent, but ageing Speaking Our Language, and it seems determined to build on that heritage in a big way.

We’ve not only got multiple series of CEFR-levelled TV programmes, but also an excellent activity website, a podcast, and now, a series of course books. They even manage to be entertaining, thanks to the infectious cheer of Joy Dunlop and humour of Gaelic’s social media man, Calum MacIlleathain. That’s no mean feat for a language course. Legends, the lot of them.

Even if you have just a passing interest in Gaelic, check the series out. It’s a masterclass in how to support a learning community.

Éditions Ellipses

The French educational publisher Éditions Ellipses was my big surprise of 2023. Ever a fan of triangulating my languages, I happened upon their language learning catalogue in France this year. They cover over 20 of them, supporting grammar, vocabulary and cultural learning. Well worth a look if you have French and fancy using it to learn other languages.

I bought a couple of good ones in Lyon, but they’re also available on Amazon: I particularly rate Petites histoires pour apprendre le grec moderne if you’re working on Greek, and Vox allemand for more advanced Germanists.

AI Platforms (LLMs)

I alluded to it in the intro, and it’s impossible to discuss learning in 2023 without a mention for AI. That’s Large Language Models to those of us in the know, as they’re more properly named – and they’re content whizzes, which makes them a perfect partner for language learning. I spent so much time bending them to my polyglot will this year that I wrote a book on using them, AI for Language Learners. Obviously, I’d warmly recommend that as a Christmas treat for any language lover! 😉

The greatest thing about AI for languages is that it’s free to build into your learning routine. Microsoft’s Bing chat is now available to all, and is as good as the best paid models right now. If you want to have a play, check out my articles on creating your own Assimil-style language learning texts and creating Anki decks using AI for starters. Once you get stuck in, you won’t be able to stop!

Language Learning : The Return of …

For me, 2023 continued the personal movie that is French : The Sequel. Having abandoned French pretty much immediately after school, I’ve watched it slip back into my life almost accidentally. For one reason or another (mainly music), I keep finding myself in France.

And it’s been a voyage (or three) of rediscovery.

It’s led, of course, to those Éditions Ellipses surprises in Parisian and Lyonnaise bookshops. It’s been such a pleasure, reconnecting… I’m not sure my French will ever be that good, but it’s fun trying! And it just confirms again that sometimes, you don’t always choose the language.

The language chooses you.

What have your 2023 language learning highlights been? Let us know in the comments!


Parallel text style learning, like Assimil courses, can be a great way to improve your fluency.

DIY Assimil : Parallel Text Learning with ChatGPT

Assimil language learning books are hugely popular in our polyglot community. And for good reason – many of us learn really effectively with its parallel text method.

They’re especially useful when the base language is another of our stronger languages, adding an element of triangulation. I learned a heap of Greek vocabulary from the French edition Le Grec sans Peine, at the same time as strengthening my (ever so slightly wobbly) French.

Now, Assimil is already available in a great range of language pairs. But it’s not always a perfect fit. For example, some editions are more up-to-date than others. More off-the-beaten-track languages still aren’t available. And at times, you can’t find the right base language – no use learning Breton through French, if you don’t have any French.

Enter ChatGPT (or your alternative LLM of choice – Bing also does a great job of these!).

DIY Assimil Prompting

Copy and paste this into your AI chat, changing the language (top), translation language (middle) and topic (bottom) to suit.

You are an expert creator of language learning resources. I want to create some text-based learning units for beginner Malay learners (level A0/A1 on the CEFR scale). The units follow the parallel text approach of the well-known Assimil language learning books.

Each unit has a text in the target language (about 250 words) on a specific vocabulary topic. It should be narrative, talking about how the topic relates to an everyday person. It should be divided into logical paragraphs. After each paragraph, there is an English translation of that paragraph in italics.

The text should be written in very clear, simple language. The language must read like a native speaker wrote it, and be error-free and natural-sounding. Source the info for the text from target language resources online, making it as up-to-date and authentic as possible. It should be completely original and not copied or lifted from any other source directly.

After the text, there is a glossary list of the key topic words from the text, sorted alphabetically and grouped by parts of speech (nouns, verbs, adjectives, adverbs etc.).

Are you ready to create some content? The first topic is: Mobile Technology

This prompt creates a prose-based parallel text unit. However, if you prefer dialogue-style texts, simply change the second paragraph of the prompt:

Each unit has a humorous dialogue in the target language (about 20 lines) on a specific vocabulary topic. The dialogue should relate the topic to everyday speakers through colloquial, idiomatic language.

The prompt works a treat in both ChatGPT Plus (paid) and Microsoft Bing (free). I also got very usable results in the free version of ChatGPT and in Claude 2. It works so well because the focus is purely on what LLMs do best: spooling off creative text.
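
And because everything except the language, level and topic is boilerplate, the prompt templates nicely if you’d rather batch-produce units than chat. A rough sketch, again assuming the OpenAI Python SDK – any chat-capable LLM client would do the same job, and the model name is just a placeholder:

```python
# Rough sketch: batch-generate DIY Assimil-style units from a prompt template.
# Assumes the OpenAI Python SDK (pip install openai) and an API key.
from openai import OpenAI

client = OpenAI()

TEMPLATE = (
    "You are an expert creator of language learning resources. Create a "
    "parallel-text unit for beginner {language} learners (CEFR {level}). "
    "Write a narrative text of about 250 words on the topic '{topic}', divided "
    "into paragraphs, with an English translation after each paragraph, "
    "followed by an alphabetical glossary grouped by part of speech."
)

topics = ["Mobile Technology", "At the Market", "Public Transport"]

for topic in topics:
    prompt = TEMPLATE.format(language="Malay", level="A0/A1", topic=topic)
    reply = client.chat.completions.create(
        model="gpt-4o",  # placeholder - use whichever model you have access to
        messages=[{"role": "user", "content": prompt}],
    )
    filename = f"{topic.lower().replace(' ', '_')}.txt"
    with open(filename, "w", encoding="utf-8") as f:
        f.write(reply.choices[0].message.content)
    print(f"Saved unit: {filename}")
```

Each saved file then drops straight into the notes-app workflow described below.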

How Do I Use Them?

So, with your shiny, new Assimil-style units spooled off, what do you do with them?

Personally, I like to copy and paste the output into the notes app on my phone. That way, they make nice potted units to browse through when I have some spare moments on the bus or train. They’re equally handy copy-pasted into PDF documents that you can annotate on your phone or tablet.

Parallel text for Malay language learning created by AI

Parallel text in Malay and English created by AI

In terms of real-world use, the self-contained, chatty texts this typically creates make perfect material for the islands approach to improving spoken fluency. Create some units on topics that are likely to come up in conversation. Then, spend some time memorising the phrases by heart. You’ll be able to draw on them whenever you need them in real-life conversation.

Enjoy prompts like these? Check out my book AI for Language Learners, which lists even more fun ways to get results without paying hefty course book price tags!

ChatGPT releases custom GPT models

ChatGPT, Your Way : Custom GPTs In The Wild!

This week saw one of the biggest recent developments in consumer AI. ChatGPT released GPTs – customisable AI bots – into the wild for Plus members, and the community has gone wild.

In a nutshell, GPTs are AI bots with custom behaviour that you define. And you define that behaviour using natural language, just like how you talk to regular ChatGPT.

Crucially, GPTs are shareable. So you can come up with a killer app idea, set it up in seconds, then share your creation with the world. Already, linguists and language lovers are sharing their creations on the socials.

ChatGPT for Worksheet Creation

Obviously, I couldn’t wait to get playing when the GPT creation tool went live this week. I’ve long been a cheerleader for topic-based units for independent study, especially when preparing for spoken lessons. So the first thing I coded up was a foreign language worksheet creator!

It’s the kind of thing I’ve been writing and sharing prompts about for a while, now. The big game-changer, of course, is that now, all that functionality is packaged up into a single, one-click module. Open it, tell it your language, topic and level, and watch it go. This will produce a range of resources and activities for independent learning, including a vocabulary list, reading comprehensions, and cloze quizzes.

Genuinely useful for self-study!

Foreign Language Worksheet Creator GPT in ChatGPT

It’s already been a learning experience, for all of us tinkerers. For one thing, I found out not to overload it by trying to do too much at once, or turning on all its capabilities (browsing, code interpretation and image creation). I ended up with a uselessly slow initial version that I can no longer even reopen to edit.

Ah well – these things make us!

Old English Monkeys

When you do get a working version, however, you can boggle at the versatility of it. That’s thanks to the billions of training points backing up the platform. I asked it to create an Old English worksheet on the topic “Monkeys”, in the style of a Modern Languages worksheet, as a cheeky wee test. Admittedly, ChatGPT did say that it would be a challenging task. After all, just how many Old English documents do researchers train their LLMs on? But the results were really not bad at all…

An Old English worksheet in ChatGPT


I expect many of us are playing these games, pushing the new tech to see how far it can go. At the very least, we can all revisit those isolated prompt ideas we’ve been collecting over the past months, and turn them into shareable GPTs – for work and for fun.

Have you had a chance to play yet? Share your proud creations with us in the comments!