Why I would like to be able to have multiple Youtube links for my prods
category: general [glöplog]
interlacing is sooooo 1990!
Trying to show 50 Hz interlacing effects over Youtube on 60 Hz displays. And so that certain types of Youtube comments are avoided. Winner's choice! It works almost as well as wine tasting over Youtube.
Quote:
There's no reason to offer both HD and non-HD streaming on Netflix because clearly the intended audience of films goes to the cinema.
There's no reason to provide lossless formats on music download stores because clearly the intended audience of music buys vinyl.
There's no reason to release non-vendorlocked ebooks because clearly the intended audience of books prefers hardcover.
Very bad analogy, but I'll bite: what we have here is someone who provides a perfectly good lossless copy of their music, but because ~5% of the audience can't play it for whatever reason, they also want to offer a version that's downsampled to 96k RealAudio.
I would like to know where your 5% come from.
Quote:
It works almost as well as wine tasting over Youtube.
Better analogy: it works almost as well as drinking imperial stout out of a pilsner glass.
but... do i have to stream the youtube video over a CRT monitor too, otherwise i ruin the experience?
Gargaj, browser stats only tell you which browsers people use, not whether they can/do watch youtube in HD. Shocking, right?
To be honest, I am tired of this silliness. Can we please, pretty please, just be able to sometimes have two videos for our prods? Why do you have to make so much fuss over nothing? This is not disagreement that you are demonstrating. This is just being contrary for the sake of being contrary.
Gargaj, I don't necessarily agree with that person, but the point I was trying to make is that I certainly disagree with the person who chips in saying that the lossless digital copy is useless to begin with and that only the Edison wax cylinder is worth a damn.
The issue isn't whether people can or can't watch HD. It's whether they are willing or not.
As for who's making the fuss, who opened the thread in the first place? :) I understand some of your reasoning, but I just don't think it merits the change you think it warrants.
Quote:
but... do i have to stream the youtube video over a CRT monitor too, otherwise i ruin the experience?
I think we should have another YouTube capture which shows the video as it would look on a CRT. For those people who only have LCD displays connected to their YouTube-capable systems, but who still want the full experience.
Be sure to also capture the degaussing of the monitor, for full accuracy.
jobe: But I don't think that's the point of the thread. We already established that some people can't watch on original hardware and some don't know how to set up an emulator. The question here, from my perspective, is this: if we have a fairly exact video capture (50Hz) on YouTube, do we need a link to a less exact one if the lower-resolution versions of the 50Hz one don't look correct?
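To make the "don't look correct" part concrete, here's a rough Python sketch of my understanding (assumption on my part: YouTube only serves 50/60 fps at 720p and above, and decimates the lower-resolution renditions to 25/30 fps):

frames = ["A", "B"] * 10    # ideal 50 fps alternation of the two colour fields
low_res = frames[::2]       # a 25 fps rendition keeps every other frame
print(low_res)              # ['A', 'A', ...] -- one of the two fields is simply gone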
Quote:
certain types of Youtube comments are avoided
The best way to avoid those is to never publish anything.
Guys, it is now well established that many of you are extremely witty and in-your-face macho-style confident. With that out of the way, can we stop discussing all that jazz and go back to the essence of the thread?
1) Why are gloperators _really_ expected to monitor the number of youtube links?
2) Why are they so shit at it (I can provide examples or you can trust me on this for once)?
3) Why do we still discuss the alternatives if the simplest solution is staring you in the face?
Quote:
1) Why are gloperators _really_ expected to monitor the number of youtube links?
They're not, there's no rule about it. But if some links are superfluous, then there's no point in having them.
Quote:
2) Why are they so shit at it (I can provide examples or you can trust me on this for once)?
Everyone makes mistakes. Like using harsh language when you're not supposed to.
Quote:
3) Why do we still discuss the alternatives if the simplest solution is staring you in the face?
I'm all for simple solutions as long as they're also the right ones. And hell, maybe just having two YouTube links for the maybe 2-3 prods involved is a better short-term solution if it keeps the hearts and minds happy. But long-term, I just don't see the point.
Quote:
Guys, it is now well established that many of you are extremely witty and in-your-face macho-style confident
Quote:
Why are they so shit at it
The self-insight is clearly strong with this one.
I am sorry I said shit. I did not mean to say that gloperators do not know what they are doing; I actually implied exactly the opposite. Given the amount of ridicule directed at me in this thread, I cannot help but think that it has little to do with the actual matter at hand and a lot to do with the platform I like to code for.
Long term I fully agree with you, actually. But we live now and release now, not in the long term. In fact, in the long term, links to youtube videos expire. Even prods get lost, never mind their recordings.
gloom, you are right, I should not have said this.
Quote:
Given the amount of ridicule directed at me in this thread
Quick solution: Skip reading the posts that are from people who are not responsible for the site's integrity.
Hey cool, so besides being a troll and not knowing our audience, I'm also shit at my job? Take a wild guess how much time I'm willing to spend on mr introspec's crappy videos from here on out :)
introspec: can you give a link to an example Youtube video of an interlaced color effect that looks close to the real thing when viewed on a normal 60 Hz display?
There's a similar problem with oldskool DOS demos, which are mostly 70 Hz, but that's practically impossible to have on anyone's actual setup nowadays. So the captures all look like crap, because motion isn't smooth and a lot of the magic is just not there.
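A rough Python sketch of why (the numbers are just my assumption, to show the mapping): putting 70 Hz source frames on a 60 Hz display or capture drops frames at irregular intervals.

SRC_HZ, DST_HZ = 70, 60
shown = [i * SRC_HZ // DST_HZ for i in range(14)]   # source frame per 60 Hz refresh
print(shown)    # [0, 1, 2, 3, 4, 5, 7, ...] -- frame 6 never reaches the screen
dropped = sorted(set(range(max(shown) + 1)) - set(shown))
print(dropped)  # [6, 13] -- irregular drops are what makes the motion feel wrong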
@yzi: Check out http://www.pouet.net/topic.php?post=500016
@havoc: You consistently choose what you like from my replies and ignore everything you don't. If you weren't so dismissive in the first place, this thread would not have happened. So I am sorry I became careless in what I said, but this does not make you any more right.
Thanks. I have a hard time believing that it could look like that on a real machine. I'll have to watch a proper full-framerate recording when I can. I also don't have a Spectrum.
Great demo by the way, one of my favourites of the past year. But I hate to watch Youtube videos of oldskool demos, because I know that I'm not seeing the real thing, and I'm spoiling the whole prod. There can only be one first time, so it should be right. I have a backlog of stuff to watch properly. For example these big C64 demos: the authors put so much effort into them, so why spoil it by making the first watch shit just because it's easy and comfortable? Watching the tube-crapped versions feels like I'm not respecting the authors, and I'm not even respecting myself.
Could we have a SPOILER WARNING on youtube links, for prods that just don't work in youtube? ;)
I mean, do people really first watch a crapped and cut mobile version of a movie, and then go see it properly in a movie theater, after they already know the plot and everything? Hey it makes sense because it is so easy and comfortable to spoil it.
By watching a full framerate recording I mean on a CRT. Or maybe a Framemeister. But 50 Hz.
@yzi, I am 100% with you. But most of us do not have a library of hardware easily available, so compromises must be made. My demo will, of course, flicker more than the blended recording suggests, but it won't flicker nearly as much as it seems to flicker in an emulator pushing 50Hz video onto a 60Hz monitor.
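If anyone wants to see why the emulator-on-60Hz case is worse, a quick Python sketch (my simplification): one source frame in every five gets shown twice, so the strict A/B alternation breaks into uneven runs.

SRC_HZ, DST_HZ = 50, 60
src = "AB" * 15                  # ideal 50 Hz gigascreen alternation
seen = "".join(src[i * SRC_HZ // DST_HZ] for i in range(24))
print(seen)    # 'AABABABB...' -- the doubled frames make the flicker irregular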
A lot of people on Spectrum looked into flicker. Depending on colour combinations, and on the specific visualization strategy, some of the flicker images can be extremely stable, so that the flicker becomes more of a sheen. But all these advances in visualization become irrelevant when people blend frames. So, the complaints about the "non-authenticity" of blended images are a problem for me not because I am afraid of criticism, but because the criticism is, ultimately, valid. One must watch demos with flicker - with flicker - to see these differences.
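Just to be clear about what I mean by blending, a rough sketch (file names made up; needs numpy and Pillow): averaging the two alternating frames produces one perfectly static mixed colour, which is exactly what kills the sheen.

import numpy as np
from PIL import Image

# hypothetical captures of the two alternating gigascreen frames
a = np.asarray(Image.open("frame_a.png"), dtype=np.float32)
b = np.asarray(Image.open("frame_b.png"), dtype=np.float32)

# what frame-blending capture tools effectively do: a static average,
# instead of A and B arriving as separate 50 Hz frames
blended = ((a + b) / 2).round().astype(np.uint8)
Image.fromarray(blended).save("blended.png")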
Some people suggested that we should simply stop using "interlace". Well, for starters, it has nothing to do with interlace on the Spectrum, because the scanlines cannot be shifted with respect to each other; people on the Spectrum call this kind of temporal blending "gigascreen". And I personally find this kind of argument ridiculous, because it tends to come from people who never saw gigascreen on the real hardware. Yes, it can sometimes flicker a lot, but it does not have to. And the argument that something so oldschool should not be done because we all have 60Hz monitors now is just utterly bizarre to hear on this site.
In addition to being bizarre, I happen to think that it is also very disingenuous. Check out the page with top ZX Spectrum demos: out of the 25 top prods according to Pouet thumbs up, 10 demos use some form or other of gigascreen technology. Almost half.
The truth is, people like good colour a lot more than they hate flicker. So flicker-graphics is here to stay. Which is why I care about the best way of presenting it to people who are not in the know.