The Bug 720p
I went through it. What puzzles me is: why does it behave that way only sometimes, even though my project workflow is exactly the same for every video I work with? The first file I open in Shotcut is always the 720p footage.
Hudson555x, thanks. I will pay more attention here. I have two videos to edit today and will keep you updated. However, my export file always says 720p, which is the source footage resolution. That suggests Video Mode was set to Automatic and the resolution was pulled from the source file, which raises the doubt that this might be a bug.
720p, also known as HD (high definition), is a display resolution measuring 1280 x 720 pixels. Resolution describes how many pixels a display has, in width x height format; the more pixels a display has, the sharper its image quality.
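To make the width x height convention concrete, the total pixel counts for the resolutions discussed here work out as follows (a minimal sketch; the resolution table and its labels are my own illustration, not from any source above):

```python
# Total pixel count and aspect ratio for common display resolutions,
# using the width x height convention described above.
resolutions = {
    "480p (SD)": (640, 480),
    "720p (HD)": (1280, 720),
    "1080p (Full HD)": (1920, 1080),
}

for name, (width, height) in resolutions.items():
    pixels = width * height
    ratio = width / height
    print(f"{name}: {pixels:,} pixels, aspect ratio {ratio:.2f}:1")
```

Note that 1080p carries 2,073,600 pixels versus 921,600 for 720p, i.e. 2.25 times as many, which is why the step up is so visible on larger screens.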
720p compares favorably to the old Standard Definition (SD), which is usually around 640 x 480. But with 2K and 4K getting more popular, 720p isn't considered very sharp for today's PC monitors (especially not the best gaming monitors), gaming laptops, or TVs. If you're buying a monitor or shopping for laptops, don't settle for anything less than 1080p.
When the Source Filmmaker exports your movie, its default is to render at 720p. This means that your movie will be 1280 pixels wide by 720 pixels high, which is an average resolution for most movies made with the SFM. If you want your movie to have a higher resolution, you can instead render at 1080p, which means a movie that is 1920 pixels wide by 1080 pixels high.
It boggles my mind: why do flat-panel TVs come in 1366 x 768 and, worse yet, 1024 x 768? The first is probably no better than a native 720p display, even if the video source has more resolution than 720p, because of the quality and sharpness lost in interpolating. The second is even worse because it also has to interpolate for the non-square pixels. Are there any real advantages to these resolutions over the two native HD resolutions (720p and 1080p)? Also, is the interpolation done in hardware or software? Is it simple nearest-neighbour (which produces poor results) or something more advanced, like bicubic or Lanczos3 (which is very computationally intensive for HD video in real time)? Roberto75780 (talk) 09:59, 4 January 2010 (UTC)
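For readers unfamiliar with the scalers mentioned above, nearest-neighbour is the simplest: each output pixel just copies the single closest input pixel, with no blending. A minimal sketch follows (the function name and the use of plain nested lists are my own choices for illustration; real TVs do this in dedicated hardware):

```python
def nearest_neighbour_scale(src, src_w, src_h, dst_w, dst_h):
    """Scale a row-major list-of-rows image by copying the closest source pixel.

    This is the cheap scaling a panel might use to map 1280x720 video onto a
    1366x768 grid; bicubic or Lanczos3 would instead blend several weighted
    neighbours per output pixel, at much higher computational cost.
    """
    dst = []
    for y in range(dst_h):
        src_y = y * src_h // dst_h  # closest source row
        row = []
        for x in range(dst_w):
            src_x = x * src_w // dst_w  # closest source column
            row.append(src[src_y][src_x])
        dst.append(row)
    return dst

# Tiny demo: scale a 2x2 "image" up to 4x4.
small = [[1, 2],
         [3, 4]]
for row in nearest_neighbour_scale(small, 2, 2, 4, 4):
    print(row)
```

The demo output shows each source pixel duplicated into a 2x2 block, which is exactly the blockiness that makes nearest-neighbour look poor on non-integer ratios like 720 to 768.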
As part of my move, I created a new home theater. Some of the equipment is still on order but, by the time it all arrives, I should have something more impressive than what I had before. The size of the screen will decrease (from 65" to 52") but the overall quality of the video will be better. However, this raised an unsettling question for me: What good is a beautiful 52" 1080p LCD if there's no 1080p content to play on it? My current plan is not to hook up a cable signal but, even if I did, the best cable can offer is 720p, which isn't what I'm looking for. An upconverting DVD player will output at 1080p, but the source material is still 480p and looks like it.
A quick perusal of the Black Friday circulars indicated that an HD-DVD player could be had for around $150. Included in that package was an offer for five free discs (picking from a pool of largely unimpressive titles, however). At first, this appeared to be one of those deals that was too good to turn down. But, as always with that sort of thing, there was a big, nasty catch. The cheap HD-DVD players were the old ones with a maximum output of 720p/1080i. Not good enough. Not something I'd spend any money on, even if it's only $150.
I want to add a clarification here. I chose Blu-Ray because it fits my needs best, both in terms of economics and because there are more Blu-Ray titles that interest me than HD-DVD titles. This doesn't mean that Blu-Ray is "better" or that HD-DVD is "worse." For those who don't care about 1080p and are happy with 720p or 1080i, HD-DVD is clearly the better economic choice.