
community-contributed subtitles are an essential part of the appeal of YouTube for me, particularly with non-English videos from small operations who may not have the resources to provide quality transcriptions/translations. I don't think this functionality was as little-used as the support post represents. Clearly more options exist for managing spam than just shutting down the feature altogether.


> I don't think this functionality was as little-used as the support post represents.

I've used it a little bit, and been watching a bunch of content that relied on it.

That said, creating subtitles with their tools was an absolute shit experience. I've done my fair share of subtitling using other programs, and it's clear that whoever made the interface on YT hasn't really used the real deal, and doesn't understand what's important.

Because the most important part for getting good subtitles is to get the timing right, to make sure every subtitle is displayed for the minimum time needed, and to make sure you split the dialogue in the right places, so that reading them "flows", so that they match what's being said. And you want to tie them to scene changes, to keyframes, so you don't get weird blinking, and for that you need the ability to step through the video frame by frame and adjust the subtitles.
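That snapping step is simple to sketch in code. A minimal illustration, assuming you already have a list of scene-change timestamps from a shot-detection pass (the function name, tolerance, and timestamps here are all made up for the example):

```python
def snap_to_scene_change(t, scene_changes, max_offset=0.25):
    """Return the scene-change time nearest to t if it is within
    max_offset seconds, otherwise t unchanged. Times are in seconds."""
    nearest = min(scene_changes, key=lambda s: abs(s - t), default=None)
    if nearest is not None and abs(nearest - t) <= max_offset:
        return nearest
    return t

cuts = [12.0, 15.48, 19.2]               # from a shot-detection pass
print(snap_to_scene_change(15.4, cuts))  # close to a cut: snaps to 15.48
print(snap_to_scene_change(17.0, cuts))  # no cut nearby: stays 17.0
```

Aligning a cue's end to the cut this way is what avoids the subtitle lingering a few frames into the next shot.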

But the YT subtitling tools emphasized the text, and translations of the text. It was easy to suggest changes to a piece of text and get that through the "community" process, but changes to the timing were super-hard to make in their tools, and really hard to get through. You can upload a .srt that you've created in a real subtitling program, but that completely overwrites the existing subtitles and was impossible to "diff" with what was already there, so that shit gets rejected as a rule, because reviewers don't understand the changes.

So the result is that whoever manages to make the first community translation that is accepted, that person's timing is then taken as canon by everyone else, and all other translations are based on those exact timings, never mind if they're good, never mind if they're a good fit for every language, never mind if someone else can do better timings.

And that completely sucks the joy out of the process, because you can't really improve existing subtitles, you can only try to change a word here and there.


Sounds like they took the wrong message from seeing that it wasn't being used as much as they were hoping. Are there any other video hosting sites with a better subtitling experience? This might be an opportunity for them.


They took the right message... for Google. Underinvest in tools, set up metrics that show that users don't use them, present those metrics as proof that "users don't need these features", remove features.


From your experience, could you recommend a tool that does subtitles and generates an .srt afterwards?

I have a friend who translates everything from English for his grandmother, so she can watch movies that she doesn't understand, but he does it by hand in a video editing program. I'd love to help him out.


I've tested a couple of the popular free ones, and settled on Aegisub. The default keybindings drove me nuts, they're not very ergonomic, but at least you can fix that easily.

.srt is the lowest common denominator for subtitle files, every subtitling program can import and export those. It's a pretty dumb format though, my biggest issue with it is that it doesn't have support for fixed-position left-aligned subtitles, and it doesn't have support for specifying the font. I understand why, it's to make playback much simpler and resolution-independent, which is why every single video player program or hardware device in existence has support for it.
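The format itself is trivial: numbered cues, a `HH:MM:SS,mmm --> HH:MM:SS,mmm` time range, the text, and a blank line between cues. A throwaway round-trip sketch (my own toy code, not any particular library, and skipping real-world quirks like BOMs and CRLF line endings):

```python
def write_srt(cues):
    # Render (start, end, text) cues as a numbered .srt document.
    blocks = []
    for i, (start, end, text) in enumerate(cues, 1):
        blocks.append(f"{i}\n{start} --> {end}\n{text}")
    return "\n\n".join(blocks) + "\n"

def parse_srt(data):
    # Inverse of write_srt; assumes well-formed input with a single
    # blank line between cues and no blank lines inside cue text.
    cues = []
    for block in data.strip().split("\n\n"):
        lines = block.splitlines()
        start, end = lines[1].split(" --> ")
        cues.append((start, end, "\n".join(lines[2:])))
    return cues

sample = [("00:00:01,000", "00:00:03,500", "Hello there."),
          ("00:00:04,000", "00:00:06,000", "Two lines\nare fine too.")]
assert parse_srt(write_srt(sample)) == sample
```

Note there is nowhere in a cue to put alignment or font information, which is exactly the limitation described above.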

But if you right-align it with a fixed margin, choose a good font, and use white text on a black background, your subtitles are going to be so much easier to read.

(Here's a sample of what I prefer: https://skamenglishsubs.tumblr.com/post/188037245260/subtext...)


Isn't that left aligned? Am I missing something?


Crap, I meant left-aligned.

(I belong to the tiny minority of people who often mix up the words for left and right, it's both annoying and hilarious.)


Personally, I find the default keybindings for subtitle timing to be the best. They're just not very newcomer-friendly.



Anecdote: I had a terrible time trying to use Aegisub, but Subtitle Edit is a total breeze: https://www.nikse.dk/SubtitleEdit/


Is i18n deteriorating at a global scale?

I’m seeing more and more nonsense machine translations and systemic errors in manual translations in my language over the past few years. Once I saw a story about a company closing a report that “should” and “should not” had been swapped in documentation as a non-reproducible issue; other times I see semi-sensical expressions that have a 1:n relationship between English and the translated language, which need context to pick between but are thrown around randomly, probably as a best effort from translators.

e.g. [“X had occurred”, “Please do X now”, “Use this to do X”, “Choose which you want for X”, “Do X for this event”]

Microsoft used to be great in this regard in the 2000s, but now it feels like I’m back when gcc was telling me “$DIRNAME am directory entering”.


I don't feel that it's deteriorating as much as that tech in general never properly understood how to do good localisation.

There was a time when many apps I downloaded were apparently machine-translated in a very bad way, so much so that it was almost impossible to understand what was meant. I haven't run into that in a while now, which means that either I download higher-quality apps, Google has stopped pushing the auto-translation feature, or people have naturally migrated away from it...

In general, I feel that tech, maybe due to being so overwhelmingly from the US, has very poor support for things like multilingualism, which is in fact more common than not across the world (the US traditionally being an outlier - and even there, I think the influence of Spanish is growing).

For example:

- On some streaming/movie purchasing services, it can be hard to get a movie in the original version and not a localised one

- It's impossible on Android to have different apps use different languages (unless the app itself allows for it), which would not only fix the issue mentioned above with the badly translated apps, but also be really helpful e.g. for language learners

- It took Google Maps years to add a feature where, if you start typing a street name and it suggests a street, it gives you the option of filling in the street number directly too (e.g., I type "Foob", it suggests "Foobarstraße, 11111 Berlin", and I can type the street number before the comma right away). My hypothesis for why this took so long is that people from the US were totally oblivious to the need for it, since in the US the street number comes before the street name, so people could just type "123 Foob" and get the suggestion for the full address

- There is simply no way in the Play Store (and I believe the App Store is similar?) to see reviews in a language other than the one of your store. This makes no sense to me: for many apps there are very few, if any, German reviews, but I'd still like to see the English ones. I think it's even worse for app developers, although maybe they have some separate way of seeing them? Amazon doesn't have that problem, btw.

- Also, a pet peeve of mine: using country flags for languages. Yeah, nope.

and so on ...


> I don't feel that it's deteriorating as much as that tech in general never properly understood how to do good localisation.

Absolutely not. I saw the i18n/l10n process in practice when I was contributing code to KDE. It was incredibly thorough and well thought-out (I actually learnt most of what I know about i18n/l10n at that time). Not just translating strings verbatim, but stuff like different languages having different plural forms (so you might need to translate "users" differently depending on if you're talking about 2 users or 3 users).
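The plural-forms point deserves a concrete illustration. The rules below are simplified versions of the gettext/CLDR plural expressions for English and Polish (real catalogs handle more cases, and the helper here is my own sketch, not the gettext API), but they show why a word like "file" can't be translated as a single singular/plural pair:

```python
# Each language maps a count n to an index into its list of plural forms.
PLURAL_RULES = {
    # English: form 0 for n == 1, form 1 otherwise.
    "en": lambda n: 0 if n == 1 else 1,
    # Polish (simplified): 1 / 2-4 / 5+, with an exception for 12-14.
    "pl": lambda n: (0 if n == 1
                     else 1 if 2 <= n % 10 <= 4 and not 12 <= n % 100 <= 14
                     else 2),
}

def nplural(lang, n, forms):
    # Pick the plural form for n under lang's rule, then format.
    return forms[PLURAL_RULES[lang](n)] % n

EN = ["%d file", "%d files"]
PL = ["%d plik", "%d pliki", "%d plików"]

print(nplural("en", 1, EN))   # 1 file
print(nplural("pl", 3, PL))   # 3 pliki
print(nplural("pl", 22, PL))  # 22 pliki  (the 2-4 rule applies to 22)
print(nplural("pl", 12, PL))  # 12 plików (but not to 12-14)
```

This is exactly the kind of thing a naive "translate the string 'users'" workflow gets wrong.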

This is not rocket science. We know how to do it. It's just that most businesses don't give a shit. English gets you far enough in terms of adoption in most markets that you don't really have to care about l10n unless you have to fulfil legal requirements. (Also, some markets have bad rates of English literacy, e.g. China, but those are usually served by local app providers.)

> On some streaming/movie purchasing services, it can be hard to get a movie in the original version and not a localised one

Where I live (Germany), this has gotten way better over the years. Around ten years ago, cinema chains started offering screenings in original language (i.e. English) for the more popular movies. And cable TV started showing shows undubbed as well. The first thing I can remember there was Game of Thrones airing undubbed on the same day as the US release. I think the major reason was that piracy sites allowed users to access undubbed content easily. If you have the choice of watching the new GoT episode right now or waiting a year for the dub, most people are going to go with piracy. TV/cinema execs saw this and realized that there was a market to tap into.

Having access to undubbed content was actually quite eye-opening to me. Having only had contact with dubs up until that point, I only then realized how eye-wateringly shitty German dubs are. It appears to me like German dubbers don't really consider themselves voice actors (emphasis on the "actor" part). Sometimes it's like they think they're reading a newscast when it's actually an action scene.


You're not contradicting me. It's not that we don't have the technology to do proper localisation, it's that the tech industry is oblivious to the needs of non-English speakers and in particular multilingual users. There is this assumption that 1 person = 1 language, which is just wrong in many parts of the world.


My disagreement is that, in the phrase

> tech in general never properly understood how to do good localisation

you're using "understood" when it should actually be "cared about" which is substantially different. Also,

> the tech industry is oblivious to the needs of non-English speakers and in particular multilingual users

I'm part of the tech industry and at the same time a non-English speaker and a multilingual user, and I'm not oblivious to my own needs. The problem is not that tech people don't understand, it's that business decisions don't take multilingual users into account.

This differentiation is important. Rephrased like this, it becomes apparent that this is a matter of policy, not literacy. It becomes possible to imagine (though I'm not arguing this) a scenario in which apps with a sufficient number of users could be required by law to accommodate multilingual users.


> I don't feel that it's deteriorating as much as that tech in general never properly understood how to do good localisation.

Yes and no. The majority of my experience with localization is via WordPress. Frankly, it's a PITA. Another thing to know, and so on. It's not an extension, so to speak; it's another mountain to climb, another silo to wrestle with.

Furthermore (and yes, this is unique to WP), it makes no effort to leverage its scale. Certainly, once (e.g.) 'Add to Cart' has been translated and vetted, it doesn't need to be done again. You should be able to submit your language file and have it parsed and sent back, fleshed out as much as possible. Then you'd only need to focus on the bits that didn't find a match.

Yes, some increase in understanding is in order. But an update and upgrade of the tool(s) is overdue as well.


Even trying to use popular software in a relatively big language like Spanish still inevitably ends up producing lots of untranslated strings (or worse, semi-translated nonsense).

I always set my software to en-us, even though that mightn't be my preferred language or dialect, because it's the only way I can be sure the developers actually checked it.


I've seen that too and I simply don't understand that. Why do some people half-translate an app? Did they just hardcode some of the copy by accident?


They translated it once years ago, likely using external resources. My team has had this issue; we made internationalized versions of our website once, years ago, and it was quite expensive and used external language contractors. Then the i18n versions gradually fell out of date as we continually updated the English version (because that's what we full-time devs speak). Eventually, years later, the i18n versions were hopelessly out of date and were taken down, because it wasn't worth paying for another round of internationalization given how little it turned out they were used.


But if you had put yourselves in a position to accept community contributed translations, this would not have happened, right?


How is a random static marketing website for a corporation supposed to put itself into a position where it accepts community translations? How would you vet said translations before going live with them (which would be mandatory) given that you don't speak the language?

And who in the world would volunteer their labor for free to do said translations?


It was intended to be a strange mixture of snark, cynicism and sarcasm connecting your described situation with what Google is doing, with the hope of illuminating the different ways people might think about their own work.

In your case, describing it as "a random static marketing website for a corporation" more or less shuts down the discussion.

In Google's case, while they probably don't see YT in those same terms, the convergence of the approach towards i18n suggests that maybe they're a little closer to it than they were.


The context of this particular thread, though, is about why the user interface in an app might not be translated well. I responded by way of example explaining how translations can fall behind in a very similar situation, that of a website.

Neither of these are the same as the YouTube situation, because in the YouTube situation you're dealing with user-contributed content being translated, which is a crucial difference. Community contributions were never about translating the YouTube UI (or god forbid privacy policy, ToS, etc.); they were only ever about users submitting translations of other users' content.


Maybe they fully translated an earlier version of the app but more strings were added in subsequent versions and not translated. When there's no translation available for a string, it falls back to English.
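A toy model of how that fallback plays out (the catalogs, keys, and function here are invented for illustration; real frameworks like Android string resources or gettext behave analogously):

```python
STRINGS = {
    "en": {"save": "Save", "share": "Share", "export": "Export"},
    # "export" was added after the German translation pass
    # and never translated:
    "de": {"save": "Speichern", "share": "Teilen"},
}

def tr(key, locale, default_locale="en"):
    # Look up key in the user's locale, silently falling back
    # to the default-locale string when it's missing.
    return STRINGS.get(locale, {}).get(key, STRINGS[default_locale][key])

print(tr("save", "de"))    # Speichern
print(tr("export", "de"))  # Export (silent English fallback)
```

The fallback is silent by design, which is why nobody notices the UI slowly turning half-English until users complain.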


Because either you have to pay a native speaker decent money to make a translation, or because some strings are embedded in code?


Google Translate used to have instant translation from a photo for more languages, and then they turned it off, with no reason given.

Also, you get tremendously worse translations when you translate a document (pdf, doc) than with the "scan/import an image of text" function.

It's a real bummer because hypertext and mobile UI should be excellent mediums for presenting multiple candidate translations and letting the reader indicate the best translation.


Auto-translation is probably a big deal for YT/Google long-term. If they maintain/achieve dominance in this field, they're practically the only ones with detailed insight into and analysis of video content. Fostering an alternative market would be negative and would cause more uproar as soon as they expand their auto-translation services.


I don't see auto translation being anything else but terrible, for many many years still. Possibly forever. I just don't see a machine being able to handle the culture, tradition or customs that are so ingrained in language.

English-to-Danish translations are universally awful. Barely comprehensible gibberish(¤). So I use English operating systems with the locale set to en_DK, so dates display as D-M-Y like God intended, but a surprisingly large amount of software somehow thinks it knows better and displays its UI in Danish anyway, so I get to have a brain aneurysm while trying to parse their Danish translation of "anisotropic filtering".

(¤) Pre-emptive snarky comment: "Exactly like spoken danish, LOL".


I don't feel this way at all. Videos have automatic subtitles which you can automatically translate into the language of your choice, speech recognition is so good that the right tool will let you program with it, text-to-speech is a button-click away for basically any web page (all I need is an extension). Post-processing for color blindness is amazing, right-to-left languages render readably on a consistent basis. OCR is progressing dramatically and we're starting to see projects focused on individual users, and automatic image tagging gives textual descriptions of a huge amount of picture content.

We're at a point where a lot of these tools haven't matured in their consumer implementations, but that's coming. It's just a matter of time.

That's all ignoring the soft accessibility of things like iPads that have made computing accessible to Grandma.


Automatic subtitles for videos in a different language are basically a joke currently.

I agree that we're progressing fast, but fully automated machine translation is IMHO still lightyears away (if at all feasible). And to automate subtitle generation in a foreign language, you first need to have speech to text, which is also still error-prone, so now you have two sources of errors.

We're seeing the uncanny valley problem: by now, things like machine translation are so good for simple use cases that they're being aggressively pushed, and at first the output may even appear correct, as if it were done by a human, but then suddenly the translation becomes nonsensical and weird. Even for the well-regarded DeepL, it's still surprisingly easy to find text that it really struggles with.

Incidentally, I remember attending a lecture about 12 years ago by the then new professor of NLP who was talking about his success with using machine aided human translation of subtitles from Swedish into Norwegian. Granted, a lot may have improved in 12 years, but it still struck me as impressive that even in languages that closely related, the best they could hope for in a research project was machine aided translation.


Machine translation can never replace real translators, unless we develop an AI with actual understanding.

Even with human-translated texts it's usually noticeable when the translator didn't understand the subject. To make sense of the translated text you then have to try to reverse-engineer the translator's mapping to figure out what the text would have said in the original.

Much like how you can't properly parse HTML using only regular expressions and string substitution, you can't truly translate human languages without understanding. You have to parse the input language, process the meaning of what was said and finally serialize to the target language.


Subtitling adds even more issues that machine translation simply can't handle, because like a good book translation, it's an artform.

Making good subtitles means you prioritize readability over accuracy. You have a limited amount of space for your text, and you want to keep the characters-per-second count low, so you cut words, ruthlessly. But you have to choose which words to cut so that it still makes sense, which means that you have to identify filler words so you can cut them, or figure out ways to rephrase something into a shorter sentence.
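That characters-per-second budget is one of the few parts a machine can actually check. A small sketch flagging cues that exceed a reading-speed ceiling (17 CPS is a commonly cited guideline for adult audiences, not a universal standard; the cue data is invented):

```python
def too_fast(cues, max_cps=17.0):
    # Return (text, cps) for every cue above the reading-speed budget.
    # Cues are (start_seconds, end_seconds, text).
    flagged = []
    for start, end, text in cues:
        cps = len(text.replace("\n", " ")) / (end - start)
        if cps > max_cps:
            flagged.append((text, round(cps, 1)))
    return flagged

cues = [
    (0.0, 2.0, "Short and easy to read."),           # 11.5 CPS, fine
    (2.0, 3.0, "Way too much dialogue crammed in"),  # 32 CPS, flagged
]
print(too_fast(cues))
```

The check tells you *which* cues to condense; deciding *which words to cut* is the part that still needs a human.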

You probably also want to preserve the tone and style of the dialogue, which means you have to choose the right synonyms, not just the most common ones.

And if you're creating hearing-impaired subtitles, it becomes even more necessary to understand what's going on in the video. If someone slams a door center-screen, you can cut that from the subtitles if you have more important things to display, but if someone slams a door off-screen, you absolutely have to include it in the subtitles, because that's the kind of information a hearing-impaired person needs.

Good luck training your little machine-learning network how to identify which sound effects originate from objects on-screen and which originate off-screen...


I agree in the general sense. The problem is that good human translation works as follows: The translator reads the text, decodes this into some mental representation, and then encodes that representation in the target language. Both decoding and encoding are also highly subjective (which is why works of literature can be translated in many different ways, see e.g. all the translations of works like the Bible, the Odyssey, etc.).

Machine translation still works by a straightforward source-to-target mapping. This assumes that there is somehow a 1:1 correspondence between concepts in one language and concepts in the other one.

There are some cases where this can yield OK results: when the languages are very closely related and/or if the material is very technical (e.g. instruction manuals), because in such cases, the concepts do tend to align a bit better.

But in general, I think the problem is intractable without solving general AI.


> right-to-left languages render readably on a consistent basis

‫os epoh dluohs I.


> I don’t think this functionality was as little used

Based on what? Because you used this feature a few times, it became widely used? We have no idea if this feature was used by more or less than 1 channel in a thousand, or these accounted for more than 1 view in a thousand.

> Clearly more options exist for managing spam

Yeah, I’ve heard this one before from well-meaning people who start out with “why don’t you just ...” without realising that the approach would have poor precision/recall at scale. Any hard-coded rule would probably rot. Building a classifier to detect this abuse would be tricky considering its low prevalence, and that ML was doing a poor job of captioning in the first place (nothing to compare it to).

Another day, another top HN comment that confidently presents opinion as fact. Would it kill folks to be a little less confident?


> We have no idea if this feature was used by more or less than 1 channel in a thousand, or these accounted for more than 1 view in a thousand.

If you really take all of YouTube into account then yes, the actual number was probably very small, considering how many cat videos, fail compilations, music videos, wedding videos etc. there are. "Last Christmas" doesn't need Cantonese subtitles but surely accounts for a lot of views.

I'm subscribed to about 100 channels, many of them making high-quality videos about different topics that required research, have animations for explanation, or otherwise took effort to make. These often do have subtitles in different languages, and I'd consider that pretty valuable. Throwing those in a bucket with TikTok compilations when evaluating the usage of community translations, or subtitles in general, is just nonsense.


Clearly they need some method of prioritizing their work. HN would have you believe that every product and every feature should be supported till the end of time, regardless of whether it's used or not. In practice, in the real world, features that have few takers are removed because the maintenance burden doesn't justify the benefit.

> I'm subscribed to about 100 channels

How often have you actually relied on community generated subtitles? Note that even if you used subtitles, those could have been auto-generated.


> How often have you actually relied on community generated subtitles?

Almost never, since all the content I consume is in English. But several channels made posts about that upcoming change and there was quite some feedback by people depending on this (as far as I could tell, especially the Spanish speaking community).

> In practice, in the real world, features that have few takers are removed because the maintenance burden doesn't justify the benefit.

By that measure, traditional TV stations better scrap subtitles too, since the number of viewers actually relying on them is a minority, and maintaining it probably takes some effort too.

I think community generated subtitles, just like regular subtitles on TV, enable people to access information (or entertainment) they otherwise couldn't. There should be a better measure for its value than just how much effort it takes to maintain that functionality vs the number of users, otherwise there would be little reason for any kind of barrier-free technology or efforts really.


So just to be clear, any project that improves accessibility can never be shut down for any reason under any circumstances? That's a pretty hard stance to take.

> By that measure, traditional TV stations better scrap subtitles too,

You made an implicit assumption that TV subtitles and Youtube community-contributed subtitles are used by the same proportion of people. That's almost certainly wrong. And remember, Youtube auto-generated subtitles still exist for all videos.

Look I don't work for Google, but it pains me when I see a thread full of people shitting on them without any basis in fact.

Here's a radical idea - we trust the people working on these things to take a call on it.


> So just to be clear, any project that improves accessibility can never be shut down for any reason under any circumstances? That's a pretty hard stance to take.

You make it sound like this feature costs a significant amount of resources and maintenance work. It's simple brokerage between users creating subtitles and creators assigning them to their videos. And then you mention auto-generated subtitles like this is something trivial that just works. Compared to everything else that is required to run a platform like YouTube, community generated subtitles pale in comparison.

> Here's a radical idea - we trust the people working on these things to take a call on it.

Yes, because when didn't profit oriented companies only want the best for mankind? Never did the quality of a product suffer because corners were cut in order to save a few cents during production. Trusting a company like Google. A radical idea indeed.


> You make it sound like this feature costs a significant amount of resources and maintenance work. It's simple brokerage

Yes. This right here. This is typical HN. You have absolutely no idea about what it takes to build or police this feature. You have no data about how much this feature is used and abused and by who. Without knowing anything you are confidently asserting that it costs very little to maintain this feature.

I'd ask you to reconsider this approach but tbh, this is the easiest way to farm upvotes on HN. So you do you.

> Yes, because when didn't profit oriented companies only want the best for mankind

I trust them a lot more than people who speak authoritatively while knowing very little.


> Yes. This right here. This is typical HN. You have absolutely no idea about what it takes to build or police this feature. You have no data about how much this feature is used and abused and by who.

As someone working in the field it's at least possible to make an educated guess about such things.

> Without knowing anything you are confidently asserting that it costs very little to maintain this feature.

No. You are primarily coming up with phrases like this, like "typical for HN" above to make it sound like everyone complaining is a pleb with no clue and you are far superior. You then go on to claim the reason people do this is to "farm upvotes" as a blanket invalidation, instead of contributing anything of substance.

> I trust them a lot more than people who speak authoritatively while knowing very little

I initially criticized that you suggested going by share of users when judging the usefulness of this feature, by comparing it to CC on TV and similar technologies. I tried reasoning why I believe this is an important and valuable feature that should not be removed. Only in my third comment did I mention that I can't imagine that it takes too much work to maintain this feature. But you immediately jumped at it, screaming THIS!! and continued your arrogant ramble about stupid HNers. The only one in this whole comment thread coming across as authoritative is you.

> I'd ask you to reconsider this approach [...] So you do you.

Sometimes it's best to follow your own advice.

Over and out.


> How often have you actually relied on community generated subtitles?

I suspect the answer to this question is entirely dependent on whether you speak English. If you don't speak English (and the person you're responding to obviously does), then you're reliant on subtitles regardless of source, unless you only stick to videos in your native language.


> Another day, another top HN comment that confidently presents opinion as fact. Would it kill folks to be a little less confident?

Apart from the last statement, "Clearly more options exist for managing spam than just shutting down the feature altogether.", the person you replied to was obviously stating their opinions, and not claiming them to be facts.

> "an essential part of the appeal of YouTube for me"

> I don't think this functionality was as little-used

Emphasis mine, in both cases.

As for the final statement, which is presented as fact, I think it probably is factually accurate that there are more possible options for YouTube than shutting the feature down.


"Little used" is a stupid argument from YT. It may be little used on a global scale but vital to some specific groups.

By this reasoning, all that would be left on YT is music and cat videos.


Every feature is useful for some specific group out there. Doesn't mean it needs to remain supported. Here's an example - Youtube used to allow clickable links inside the video. Now it doesn't. That affected many channels, especially those that implemented their own "more like this" feature manually.

You have to make hard choices sometimes. If this is a feature that only a tiny minority cares about, then the team has to pull the plug on it.


With subtitles, the obvious solution is to not show them by default, even if the user has the corresponding language selected by default for subtitles in general. The net result would be the same as with this change, except that those of us who rely on community-submitted subtitles would still be able to use them (at the risk of seeing spam occasionally - but it's better than no risk and no feature).


I think the point is that Google wants to save resources, i.e. put employees to work on something else.


Amen. How can you make claims like that so confidently? How can you think you know better about something than the team that built it and/or works on it for a living? Imagine someone making these claims about a product you work on. "You should add feature X. Everyone will use it." No, we've tried that before and only a tiny subset of users actually used it, and it's a massive bitch to maintain.


Yep, I see this feature being actively used on many of the channels I follow, and I see it being appreciated frequently (and asked for when not enabled). This is a good example of inclusiveness in action: many people are able to appreciate content that otherwise may not be compelling to them due to language and ability differences. Really sad to see it scrapped.


I think it'd be helpful if community captions were more discoverable. I have no way of checking whether the captions on a video are machine-translated or human-translated without going into sub-menus. If the product of community members' work were more apparent, the contributors would feel like their labour had a bigger impact and want to create more captions.


The problem is, if >50% of your videos are contentless garbage, another quarter is from large networks, and a large portion of the rest is in English, it's easy to see how such a misconception can come into existence.

(Note: Ratios are rough guesses based on my experience.)


There was no option to disable them, and if you are multilingual it's a PITA. Sometimes I became so annoyed that I closed YT.

It's not the same effort for my brain to keep up listening to Spanish, English, French, or Portuguese. And I WANT TO KNOW what I'm getting into. If I want subtitles or translations I'll activate them myself, thanks.

Sometimes I browse for a topic and I need a native POV on it, but it became so difficult because YT just treats you as if you were stupid, so you lose time going back and forth.

IDK, maybe there was some option to manage it, but it was so well hidden in the menus that I couldn't find it.


Toolbar in the bottom right, subtitle icon, disable.


That's only for subtitles, and somehow if you go to another video, you have to do it again.


Automatically generated subs are good enough in most cases... Also, it is a very underused feature: you turn them on either because you have some hearing issues or because you are watching something in a foreign language, both of which are probably pretty rare.


My wife is a native Thai speaker, fluent in English, but if the speaker is speaking quickly and not clearly, it's very frustrating for her to follow along (especially if the speakers themselves are non-native speakers, or speaking with an Australian accent). Most of the automated subtitles on Youtube seem heavily biased toward high-school level vocabulary, so if the speaker is using more rare words, making literary references, or using some of the more common Latin / Greek / French phrases imported into English (medical/technical terminology, etc.), the subtitles can be very misleading. The automated subtitles are also generally garbage at catching proper nouns, often replacing them with rhyming phrases of common words.

At least the garbage is pretty consistent, so I can pause the video and tell my wife that the phrase X Y Z in the subtitles is actually A B. (The number of syllables is almost always correct, but often the number of words is not.)

On a side note, it's not YouTube, and it's translation rather than straight subtitling, but I've seen some pretty bad English subtitles for Netflix's La Casa de Papel (Money Heist). I don't know a lot of Spanish, but I do remember a few times the translations were very odd, and I realized the translator was translating a person's surname from Spanish into its English meaning, and not capitalizing it. It would have been just fine leaving the name untranslated, as my wife and I could both clearly make out the names of the characters. I presume a human translator would know not to translate names. I hope the subtitles I saw were third-party subtitles where someone ran Spanish subtitles through Google Translate.


Why do you think it's rare to want to watch something in a foreign language?

For the vast majority of people on this planet, English is not their first language (if they speak it at all), and their command of English may often not be good enough to comfortably understand all of the content they might want to enjoy. And it's still the case that most content on such platforms, and certainly often the most viral content, is in English.

And even beyond that, people sometimes learn other languages, in which case watching something in the target language with subtitles can be a very helpful step.


If you're American or British, maybe yeah. That leaves aside all the people who do not speak it as a main language, those who use the captions to gather additional context from translators, those who cannot watch the video with sound for any reason... The list is endless.

Oh, and the automatic subs are absolute trash if whoever is speaking is not doing so with an American accent, in a perfectly quiet room.


If you speak multiple languages (which isn't uncommon out of the anglosphere) it becomes very annoying to have YT putting subtitles and translating stuff for you.

If I want subtitles or translation, I should be the one deciding, or at least be given an option to opt out that doesn't mean doing it for every video. And that was just for subtitles; it wasn't possible at all for titles and descriptions.


The automated video title translation "feature" is really quickly turning me off YouTube. It's just confusing as hell to see a German title and then realise the video is actually in English.


English is my first language and I don't want automatic subtitles or translation for stuff in other languages.


To be honest, YouTube’s captioning struggles with even British accents.


Another use case is when you want to watch a video without the audio to avoid bothering the people around you.

The quality of auto closed captions depends on the language and the speaker. It works well for "presentation" content like vlogs and news; it fails for casual discussion or songs.

