Decoding The Art of Subtitling
The first time I paid real attention to subtitles was while watching Quentin Tarantino’s Inglourious Basterds – a movie where the characters speak a blend of Italian, German, French, and English. Later, I did some digging into the ‘behind the scenes’ of the movie. To my surprise, many of the actors who spoke Italian, German, and French weren’t native speakers. A translator – Paloma Guridi – helped the actors with correct pronunciations. In Guridi’s words, this one was “a very multilingual shoot”.
The production did an even better job with the movie’s subtitles. For instance, Brad Pitt’s character pretends to know Italian and hence speaks with a heavy foreign accent. This accent was captured very well in the subtitles – when he said “Si, correcto”, the subtitle read “Yes, ‘er correct.” That’s how I knew he was putting on an act.
Had the production not put so much effort into the subtitles, it would have lost out on context for many viewers (like me) who aren’t native speakers of Italian, German, and French.
With people all around the world consuming culturally and linguistically diverse content, the importance of subtitling has increased in the past decade. But have you wondered what goes into writing subtitles? Let’s take a deep dive into the art of subtitling.
Benefits of Transcription
A transcriber listens to the audio of an audio or video file and converts it into a written text document. Transcriptions are usually in the same language as the audio. Transcribers need to be native speakers of that language so they can pick out the sounds and accents specific to it.
Transcription helps people access your video and audio files in a text format. The practice of transcribing is also widely used for maintaining a record of meetings and proceedings. Many corporations, multinationals, and public companies transcribe their meetings to make them available to all employees, investors, and stakeholders.
Benefits of Captions
Captioning is the process of dividing the text of the transcript into time-coded sections so they can be displayed (typically) at the bottom of the screen while the audio or video plays. The captions must match the timing of the audio to help listeners and viewers follow what’s going on. Captions are always in the same language as the audio.
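To make “time-coded sections” concrete, here is what a single caption cue looks like in the widely used SubRip (.srt) file format – a cue number, a start and end timestamp (hours:minutes:seconds,milliseconds), and the text to display. The timestamps and text below are illustrative, not taken from any real file:

```
1
00:00:03,500 --> 00:00:06,000
Yes, 'er correct.

2
00:00:06,500 --> 00:00:09,200
[train whistle blows in the distance]
```

The second cue shows the kind of non-speech sound description that closed captions add for deaf and hard-of-hearing viewers.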
While captions just display what’s being said, closed captions also describe the various sounds and activity in the audio/video. These help deaf and hard-of-hearing (DHH) viewers consume the content easily. As per the 21st Century Communications and Video Accessibility Act in the US, all video programming on television and the Internet is required to have closed captioning. This helps the DHH community easily access all kinds of video content.
This year, 85% of Facebook videos were watched without the accompanying audio. If your video has people speaking or engaged in a conversation, captions become necessary. Viewers who watch your video with the sound off will then be able to read the captions and understand what’s happening in the video.
Benefits of Subtitles
Subtitles are usually meant for an audience that doesn’t speak the same language as the audio in a visual setting. Subtitling involves two steps – transcribing and translating. A subtitling expert would first transcribe the audio (in the source language) and then translate the text into the target language. Subtitling experts must be native speakers of the target language and fluent in the source language. Like captions, subtitles are written in the form of time-coded sections.
When you produce audio or video content, you probably assume that your listeners or viewers are fluent in the language of your content. This assumption can limit your audience and make it difficult for non-native speakers to consume the content. That’s why multilingual subtitles make your audio/video files accessible to people belonging to different linguistic groups.
Subtitles also serve a great SEO purpose. Search engines are designed to crawl text, so Google crawls all the text that appears along with your video – title, keyword tags, description, etc. When you add a subtitle file (read: text) to your video, search engines are more likely to show your video in the results of local searches in that language.
The Scope of Subtitling
Typically, millennials consume video content while multitasking – they may be watching a video while eating, waiting for the bus, taking the subway, etc. Consequently, they may miss some of the audio or choose not to play it in public where it could disturb others. In this case, subtitles help them catch up. Even while scrolling through their social media feeds, they usually have the sound off. If your video doesn’t have subtitles, it’s very likely they won’t pause to watch it.
Add to this the fact that many OTT (over-the-top) platforms like Netflix, Amazon Prime Video, and Hotstar are aiming to reach non-native speakers with multilingual subtitles. For instance, Netflix’s popular show Narcos – originally shot in Spanish – was subtitled in multiple languages, including Hindi and English. Another show, Sacred Games, was subtitled in 24 languages. As a result, two out of three Netflix subscribers who watched the show were from outside India. Also, as anyone in the media industry will tell you, subtitling is a cost-effective alternative to dubbing.
Whether your brand is into finance or eLearning, subtitling is essential for effectively marketing your audio and video content. In an age where people appreciate local and regional content, subtitling helps you make video content relevant to anybody who happens to watch it. In the next part of this series, learn about the 5 industries that are winning over their audiences with subtitles.