9 days?!
We might need holodeck technology for that to become a reality.
Shot/scene backgrounds can be somewhat expanded like that with real footage from nearby frames if there is camera or zoom movement (BorisFX Mocha Pro, for example). There tend to still be some missing bits, but there would be much less work for the AI to do, because it would have more than the original 4:3 frames to work with: the totality of the background across the entire shot.
Trouble is that those pesky directors tend to do cuts/shots every 4 or 5 seconds in some genres, and not a lot longer in most others, so your AI might have to work on hundreds or thousands of short clips and try to find consistency between clips that often isn't there. Heck, AI is already inconsistent at duplication even when everything "real" is identical. AI would invent the missing bits of the same backgrounds differently in different clips, and we'd notice.
But it should be possible to fill the missing parts of an expanded (4:3 > 16:9 etc.) static background in, say, a 5-second clip, probably at lowish resolution, I guess. Making it look the same in other shots, or doing it for animate objects including people, I suspect won't happen though, not in the foreseeable future anyway.
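The single-shot case described above can be sketched in miniature: with a pure horizontal pan and no parallax, you can estimate the offset between two frames and composite them onto a wider canvas. This is only a toy NumPy example with synthetic data and a brute-force shift search; real tools like Mocha Pro use planar tracking and homographies, not this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "true" wide background; the camera only ever sees 100-px-wide crops of it.
wide = rng.random((50, 140))

# Two frames from a simulated horizontal pan (40 px apart).
frame_a = wide[:, 0:100]
frame_b = wide[:, 40:140]

def estimate_pan(a, b, max_shift=60):
    """Brute-force search for the horizontal offset of b relative to a,
    minimizing sum-of-squared-differences over the overlap region."""
    h, w = a.shape
    best_d, best_err = 0, np.inf
    for d in range(max_shift + 1):
        overlap = w - d
        err = np.mean((a[:, d:] - b[:, :overlap]) ** 2)
        if err < best_err:
            best_d, best_err = d, err
    return best_d

d = estimate_pan(frame_a, frame_b)

# Composite both frames onto a wider canvas; later pixels win in the overlap.
canvas = np.zeros((frame_a.shape[0], frame_a.shape[1] + d))
canvas[:, :frame_a.shape[1]] = frame_a
canvas[:, d:d + frame_b.shape[1]] = frame_b

print(d)                          # 40: the pan we simulated
print(np.allclose(canvas, wide))  # True: the wider background is recovered
```

With rotation, zoom, or parallax you would need per-frame homographies and blending instead of a simple paste, which is where the leftover "missing bits" come from.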
I don't think you're likely to see much effort toward that goal, since original aspect ratio and black bars became the SOP over stretched widescreen way back in the early days of DVDs.
It's not a stupid idea. But unfortunately it doesn't yet exist in reality.
Maybe it'll come one day. Maybe in 10 or 20 years. But to do it right, you'd need a special model built for the 4:3 Star Trek series so that it can analyze every sequence in every episode. That way, it could better determine the fill areas for 16:9, based, as you say, on camera movements and the neighboring images, and keep the whole thing coherent.
Aion always fails with an error within a few minutes of starting a conversion. Not enough VRAM on a 3070 to use this model, I'm assuming?
I have been struggling with this problem for a long time; I explained all my latest findings in the last post of the thread I opened some time ago about it:
Are there any plans to support ProRes 4444 output?
That's right!
That's what happens when Aion has such strict memory requirements that I had to use the CPU.
The first run did not produce a file for a reason I don't know, so after updating my OS and the program, I am checking every 12 hours that the output file is growing.
P.S. I wish partial files were playable; I'm pretty sure mpv and VLC have the functionality, but it doesn't work.
Oh no! If I can help you out by processing it for you, don't hesitate to ping me.
I think it's exclusive to Mac. I don't really know, but I think I see someone say that about once a week on here.
Even when I don't use mark-in and mark-out, I'm getting Aion failing in the same way on my 3070 8GB.
I'm trying to take a 720p 30fps file to 1080p 60fps with Proteus and Aion, so nothing particularly demanding, I wouldn't think.
Given that Aion's QUALITY is VRAM dependent according to Topaz, I probably won't get great results with it anyway.
I just wanted to experiment with it, I guess, because Apollo still leaves something to be desired: while it's supposedly AI-powered, I would say its quality is merely on par with Adobe Optical Flow's non-AI interpolation that was introduced 5 years ago. Well, maybe it's slightly better than Optical Flow, but not by a lot.
Here are a few clips I made as a test. You can try the supplied original clip to see how Adobe Optical Flow compares. If they're just using Nvidia Optical Flow, then all of the interpolation models in TVAI are easily better. (You might have some trouble, since it seems Adobe is loath to admit that Matroska is a competent digital video container worth supporting.)
I guess my opinion is based mostly on the anecdotal evidence that I still frequently see obvious artifacting with Apollo at a 2X framerate when someone waves their arm across the frame, the same way I do with Optical Flow, even if it's a little less obvious with Apollo. In both cases, the footage with the artifacting would still be unusable and must be edited around, or that segment of the video must be replaced with non-interpolated footage.
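That arm-wave artifact is the classic failure mode when an interpolator can't resolve fast motion and degrades toward blending the two frames: a half-brightness ghost appears at both the old and new positions instead of one full-brightness copy halfway between. A toy 1-D illustration (not how Apollo or Optical Flow actually work internally, just the ghosting effect itself):

```python
import numpy as np

# A bright 1-D "arm" moving fast between two consecutive frames.
width = 64
frame1 = np.zeros(width); frame1[10:20] = 1.0   # arm at the left
frame2 = np.zeros(width); frame2[40:50] = 1.0   # arm at the right

# Naive (non-motion-compensated) midpoint frame: average the two frames.
blend = 0.5 * (frame1 + frame2)

# A motion-compensated midpoint would place the arm halfway instead.
ideal = np.zeros(width); ideal[25:35] = 1.0

print(blend[12], blend[45])  # 0.5 0.5 -> two ghost copies at half brightness
print(blend[30])             # 0.0     -> nothing where the arm should be
print(ideal[30])             # 1.0
```

When the motion estimate is good, the result looks like `ideal`; when it fails, the output slides toward `blend`, which is the double-image smear you see on fast-moving limbs.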
I've heard that said too, and I just cannot fathom why. I paid an arm and a leg to upgrade from a 422 to a 444 recorder for my camera, and now I have to buy a Mac too?
Could we get an official statement on this @tony.topazlabs?
Feels like I'm missing something when I have to sacrifice half of my color resolution in such advanced software.
I was considering purchasing an upgrade, but it won't make sense until this issue gets resolved.
Could I please get an update/confirmation that this is being looked into?
9 days? But that's madness. You're going to burn up your CPU running your PC 24/7 for 9 days, especially rendering. It's not good for the machine. And hello, energy consumption, too.
Hmmm, it's only bad for the machine when the cooling is inadequate.
Normally a PC should be able to run at max, especially when overclocking.
You only burn up your hardware if it's not well cooled… by the way, the CPU will throttle its own speed when it gets hot.
A cooling system that's not designed for running at max should never be used in a PC for long periods…
beautiful teeth!