ArcSoft Claims HD From SD With Nvidia CUDA

Status
Not open for further replies.

ohim

Distinguished
Feb 10, 2009
1,195
0
19,360
No matter what technique you use, you can't turn SD material into HD ... SD PAL material has 720x576 pixels while 1080i has 1920x1080; no matter what filters you use, you can't "invent" the missing pixels out of nothing, and at that ratio roughly four out of every five pixels would have to be made up.
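For scale, the pixel counts in question are easy to check; this is just arithmetic, not anything from ArcSoft's software:

```python
# Raw pixel counts: SD PAL frame vs. a full 1920x1080 HD frame.
sd_pixels = 720 * 576        # SD PAL
hd_pixels = 1920 * 1080      # 1080-line HD
ratio = hd_pixels / sd_pixels
print(sd_pixels, hd_pixels, round(ratio, 2))  # 414720 2073600 5.0
```

So an upscaler has to synthesize about 80% of the output pixels.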
 

KyleSTL

Distinguished
Aug 17, 2007
1,678
0
19,790
It's real, it just makes unrealistic claims about enhancement. 720x480p will never become true 1920x1080p no matter how you slice it, but I'm sure the image quality certainly won't suffer from the post-processing. Same with standalone DVD players and receivers with the ability to upscale. PowerDVD has a similar feature called TruTheatre HD; the elimination of the 'e' at the end of words is becoming the new 'Extreme' or 'Turbo' or 'i'. It just makes it that much cooler.
 
G

Guest

Guest
Actually, there is a lot of hidden information between pixels which CAN be used to extract extra resolution from a given image. Every pixel carries information about the pixels near it, and combined with knowledge of how the codec compressed the video, the result can reveal/"invent" much more detail than a single frame actually holds. It's sort of like filling in boxes in a sudoku puzzle using logic and fuzzy guessing. This technique is different from upscaling and is called super-resolution. There is a lot of research going on in this field right now. That said, I don't know how effective ArcSoft's software is.
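The idea behind multi-frame super-resolution can be sketched in a few lines. This is a toy 1-D illustration only (real algorithms, and whatever ArcSoft actually ships, are far more involved); it just shows why several sub-pixel-shifted low-res observations of the same scene carry more detail than any one of them:

```python
# Toy multi-frame super-resolution: low-res frames, each sampled with a
# different sub-pixel shift, are interleaved back onto a 2x-finer grid.

def downsample(signal, shift):
    """Keep every 2nd value of a 1-D signal, starting at `shift` (0 or 1)."""
    return signal[shift::2]

def super_resolve(frames_with_shifts, length):
    """Place each low-res sample back at its true position on the fine grid."""
    fine = [None] * length
    for frame, shift in frames_with_shifts:
        for i, value in enumerate(frame):
            fine[shift + 2 * i] = value
    return fine

# A "high-res" scene we pretend we can only observe at half resolution.
scene = [3, 7, 1, 9, 4, 8, 2, 6]
frames = [(downsample(scene, s), s) for s in (0, 1)]
print(super_resolve(frames, len(scene)))  # recovers [3, 7, 1, 9, 4, 8, 2, 6]
```

With two properly shifted half-resolution frames, the full-resolution signal is recovered exactly; real video rarely cooperates this neatly, which is why practical super-resolution also needs registration and regularization.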
 

celebrity

Distinguished
Apr 2, 2009
1
0
18,510
My question exactly. But I think it depends on the codec involved and how much extra information is stored within it. I suspect the newer codecs used for HD are much more heavily compressed and less likely to allow the missing information to be extrapolated. After all, why else can you get almost DVD quality out of a file sometimes less than 1/8 the size when using DivX (or H.264, which HD broadcasts use)?
 
G

Guest

Guest
It might not be entirely impossible, though the quality gained by the procedure may look unlike the original.
First there is linear and cubic interpolation, where the data of one pixel is expanded to a cluster of 4 pixels.

A more complex, and more CPU-intensive, procedure might expand 720x480 to 1440x960, so a cluster of 4 pixels turns into 16.
Depending on the shape of an object, the GPU calculates how an object that crossed the 4 pixels should be displayed across 16 pixels.
If it crosses in a straight (or curved) line, the result will look much like enabling 2x AA in games. This expansion from 4 to 16 pixels could be more than the simple upscaling happening all over the screen.

A third way this technology could work is if the processed images are compressed with a lossy encoder like DivX or XviD.
Both codecs can successfully identify cut-out foreground objects, sometimes store them as high-quality JPEGs, and slide them across the screen, e.g. to show an animation of an object moving across the screen.
The data they gather about these objects sometimes surpasses the quality or resolution at which they are displayed (e.g. as they move across the screen they are also scaled down to smaller objects).
Instead of recalculating an object every frame, DivX and XviD can 'scale' that object larger or smaller, keeping the most detailed version in memory and using it for display even at lower resolutions.
If the software is capable of calling up these objects, it could load the data it has about an object into memory and perhaps display it at a slightly higher resolution than the original display resolution
(if you get what I mean).

The latter two examples are very uncommon and probably extremely CPU/GPU-intensive tasks. So I think this method is nothing more than expanding each pixel's data to a 2x2 pixel array and using the simple 2x AA approach.
The original might look different from the upscaled version, especially if the image contains lots of data and detail!
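The first, simple case described above (one pixel becoming a 2x2 cluster whose new values are blends of neighbours rather than invented detail) can be sketched directly. This is a minimal interpolation demo on a grayscale image stored as a list of rows, not a claim about how ArcSoft's processing works:

```python
# Minimal 2x upscale of a grayscale image with simple neighbour averaging,
# a crude stand-in for the linear interpolation described in the post.
# Each source pixel expands to a 2x2 cluster; the new values are averages
# of nearby source pixels, so no genuinely new detail appears.

def upscale2x(img):
    h, w = len(img), len(img[0])
    out = [[0] * (2 * w) for _ in range(2 * h)]
    for y in range(2 * h):
        for x in range(2 * w):
            # Nearest source pixel, clamped at the border...
            sy, sx = min(y // 2, h - 1), min(x // 2, w - 1)
            # ...and its neighbour toward the sub-pixel position.
            ny, nx = min(sy + y % 2, h - 1), min(sx + x % 2, w - 1)
            # Average the four source samples (equal weights for simplicity).
            out[y][x] = (img[sy][sx] + img[sy][nx] + img[ny][sx] + img[ny][nx]) // 4
    return out

src = [[0, 100],
       [100, 200]]
for row in upscale2x(src):
    print(row)
```

Every output value is a blend of the four input values, which is exactly why plain interpolation smooths edges instead of recovering HD detail.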
 

KyleSTL

Distinguished
Aug 17, 2007
1,678
0
19,790

If you can't tell the difference between 480 lines (DVD, standard TV) and 720/1080 lines (HDTV, Blu-ray), you probably need your eyes checked. No reasonable amount of distance from my TV makes them look equally sharp.
 
G

Guest

Guest
Seriously, do any of you actually have an upscaling DVD player? They are fairly successful at improving image quality on more recent DVDs; it's especially apparent if you have a larger TV. No, it won't look as good as 1080p, but on a smaller display it will look good enough.
 

KyleSTL

Distinguished
Aug 17, 2007
1,678
0
19,790
Yes, and it does make a tremendous difference, but it is not 1080 lines. I'm not saying it's worthless, but the claims don't match the performance (as with most marketing).
 

pocketdrummer

Distinguished
Dec 1, 2007
1,084
30
19,310
[citation][nom]BigBag[/nom]Shouldn't we be able to turn HD content into super HD using a similar process then?[/citation]

Yep, it's the new UD... Uber-definition!
 

vaskodogama

Distinguished
Oct 3, 2008
114
0
18,680
[citation][nom]MTE[/nom]Seriously, do any of you actually have an upscaling DVD player? They are fairly successful at improving image quality on more recent DVDs; it's especially apparent if you have a larger TV. No, it won't look as good as 1080p, but on a smaller display it will look good enough.[/citation]
No. The cheap SD TVs on the market have a high pixel pitch, meaning their pixels are spaced far apart. A high-quality SD TV with a low pixel pitch (and therefore a high price) can show a good picture even when you sit close to it.
As for the technology ArcSoft claims it will produce, I believe it could happen, because it needs CUDA, which means heavy processing. So I'm eagerly waiting for it!
 
G

Guest

Guest
Tried it. Sadly it doesn't live up to the hype; in some cases it even degrades the video quality. Shame.
 