American semiconductor company AMD has taken the wraps off its new 3rd-generation Ryzen Threadripper processors. The product line comprises the 24-core 3960X and the 32-core 3970X, featuring the company's 7nm 'Zen 2' core architecture, 140MB and 144MB of total cache respectively, and up to 88 PCIe 4.0 lanes.
In its announcement on Monday, AMD provided some examples of the performance increases consumers can expect from its new Threadrippers, including up to a 47% performance boost in Adobe Premiere. That's with the 3970X processor model in comparison to the market's 'top-end HEDT processor,' according to AMD.
The company offers a more detailed look at the technical aspects of its 3rd-generation processors in the video above. Both models will be available to purchase globally starting on November 25. The 3rd-gen Ryzen Threadripper 3960X will be priced at $1,399 USD and the 3970X at $1,999 USD.
Video is such a natural fit for parallel processing that even older software such as Premiere Pro can take advantage of it easily. But legacy software such as Photoshop and Lightroom was not designed that way (it isn't intrinsically designed to break an image into pieces and process the pieces in parallel).
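To illustrate the idea (a toy sketch of the general technique, not Adobe's or anyone's actual code): tile-based edits are embarrassingly parallel, because each tile can be handed to a separate core and processed independently.

```python
from multiprocessing import Pool

def brighten_tile(tile):
    # Per-tile operation: independent of every other tile,
    # so tiles can run on separate cores at the same time.
    return [min(255, px + 10) for px in tile]

def split_into_tiles(pixels, tile_size):
    # Break a flat pixel list into fixed-size chunks.
    return [pixels[i:i + tile_size] for i in range(0, len(pixels), tile_size)]

def process_parallel(pixels, tile_size=4):
    tiles = split_into_tiles(pixels, tile_size)
    with Pool() as pool:                       # one worker per CPU core by default
        done = pool.map(brighten_tile, tiles)  # tiles processed concurrently
    return [px for tile in done for px in tile]

if __name__ == "__main__":
    image = list(range(16))  # stand-in for flattened pixel data
    print(process_parallel(image))
```

Video codecs get this structure almost for free (frames and macroblocks are natural tiles); a single-image editor whose operations were written pixel-serially in the 1990s does not.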
The problem is that a whole generation of photographers has become used to the weird and quirky methods of PS and LR. Change will come about when that crowd starts retiring, and a younger, more impatient crowd has no time for clunky and weird old Photoshop.
I use a Ryzen 9 3900X for general computation and it's an extremely fast processor for the price. I dare say it's twice as fast in general software as an Intel processor costing twice as much, which makes AMD roughly 4x Intel in price/performance. Intel will have a very hard time competing on price/performance in the next couple of years.
That's a workstation thing, not desktop. For desktop I would rather buy an Intel i7. The i7 often wins hands down in gaming most of the time. Ryzen can't win there; it only does better than Intel at multimedia. For gaming, Intel is the only one.
Yeah. This isn't a gaming site... Plus, even when it comes to gaming AMD has basically caught up with Intel. Overall, the new Ryzen CPUs are incredible value.
Unfortunately most tests show Intel and Nvidia still well ahead in most applications that are optimized for both Intel and Nvidia. That's the problem.
"Unfortunately most tests show Intel and Nvidia still well ahead in most applications that are optimized for both Intel and Nvidia. That's the problem."
My computer has a 4-core i7-7700K turbo/overclocked to 4.5GHz, and it closely matches the 3950X at 1-, 2- and 4-core loads. So no difference there whatsoever.
Capture One Pro will eat up virtually every core you can throw at it (at least up to 10 or 12 cores). Unfortunately, with Intel's CPUs, there's a drastic line of diminishing returns on your investment when you opt for 10+ cores. The 9900X is $1,000 USD. That's a hefty premium for a bit more than a marginal return on performance.
On the flip side, the 12-core 3900X is only $500, which is $25 more than the 9900K and $100 less than the Intel 9900KS. In Photoshop, the performance difference between the two is negligible, so AMD has caught up. In Lightroom (and I believe Capture One Pro as well), AMD is the better performer.
Capture One also utilizes GPU acceleration throughout. 8 cores is the minimum you should be using with C1P, and I bet these new AMD CPUs just crush it.
Photoshop still favors fewer cores with higher frequencies. BUT! AMD has closed that gap. Intel is no longer the "Photoshop King" like it was with the 7700K and the 8600K CPUs.
The biggest improvements I think people would notice (if they have a slow system and are looking for options) are adding a lot of memory, switching to an SSD for editing, and getting a dedicated graphics card if you're still using "onboard" or integrated graphics. The memory and SSD will likely yield the biggest performance boost for most people, and those upgrades don't break the bank (you can usually do them yourself).
People should also keep in mind that software will only utilize as many cores as it is coded to utilize. An app threaded for a single core will only use one core, even if you have 24. Multi-threaded apps will probably run smoother, but just because you have 24 cores doesn't mean all programs will use all 24.
The people who will benefit from the extra cores are video editors working with 4K or 8K video, or perhaps people batch-processing large numbers of high-res uncompressed raw files. For the average photographer/photo editor, this would be overkill, and you might be better off spending your money on an 8-core processor with good dedicated graphics, a fast SSD and lots of memory.
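The core-count point above can be shown in a few lines; `APP_THREAD_LIMIT` here is a hypothetical constant standing in for however many workers a given application was written to use:

```python
import os
from concurrent.futures import ThreadPoolExecutor

APP_THREAD_LIMIT = 4  # hypothetical: this app was only ever coded for 4 workers

def effective_workers() -> int:
    # However many cores the machine reports, the app never spawns
    # more workers than it was written for: a 24-core CPU gains
    # nothing over a 4-core one here.
    return min(APP_THREAD_LIMIT, os.cpu_count() or 1)

def run_jobs(jobs):
    # All jobs funnel through at most APP_THREAD_LIMIT threads,
    # regardless of the hardware underneath.
    with ThreadPoolExecutor(max_workers=effective_workers()) as pool:
        return list(pool.map(lambda job: job(), jobs))
```

On a 24-core machine `effective_workers()` still returns 4; only software explicitly threaded wider can touch the remaining cores, which is why a Threadripper mainly pays off in renderers, encoders and batch exporters.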
They are too busy with their PS and LR for iPad; we ain't important to them. Don't bother, just pay your 10 bucks per month for the photography package and enjoy Adobe's legacy code.
We are on DPReview, a photography site; the main product we use is Adobe Lightroom. I have a 7700K with 32GB of RAM and Lightroom utilizes 10% of the CPU at peak while lagging impossibly, thanks Adobe. I've paid 10 bucks per month for years for your outdated, unoptimized products. Now I apparently should buy a Threadripper so my CPU would be 4x more powerful and Lightroom would use 2.5% of its capacity. Thanks again Adobe. I love you so much.
Going from a 5-6 year old 2GHz 4-core laptop with 16GB RAM and onboard graphics to a 4.6GHz 12-core desktop with 64GB RAM and high-end graphics didn't make much difference to Lightroom.
Batch processing and exporting is noticeably faster, but Lightroom still gets laggy and unresponsive. I close it and reopen it every so often to keep it usable.
I think anyone upgrading their CPU just for Lightroom will be wasting their money.
So much this. My LR seems to eat RAM like no tomorrow after editing a handful of photos. CPU usage doesn't seem to be a factor unless importing/exporting.
I consider LR slow but it rarely ever "grinds to a crawl", unless I've done hundreds of spot removals on the same photo, in which case I'd rather do it in PS and save as TIFF, and I'm on really mediocre hardware. Maybe consider splitting your library? A smaller library may help.
So why aren't you using Capture One Pro? It utilizes multiple cores (at least up to 10 or 12) very efficiently, and GPU acceleration throughout. It's silky smooth and wonderfully fast.
I have a high-end gaming desktop and I remember having problems with an older version of Adobe Photoshop CC that crashed, which was ridiculous. It had nothing to do with hardware whatsoever. It's all right now with the latest version. I'm not much of a fan of Lightroom because Photoshop is my main preference.
I'm happy that AMD has been back on its feet again (for a while now). The Intel monopoly tried everything it could to kill AMD, and they very nearly succeeded.
Obviously these processors are not for the majority of people on DPR, unless they are in the business of 3-D modeling and such (or, maybe, 8K video). I built a desktop last year using AMD's 8-core Ryzen 7 2700X (unlocked processor). Gen 3 has been out for months now. The motherboard can handle the 3700X, but I see no need for that. Graphics, video, simulation, and other engineering applications run in a fraction of a second. I love the processor. I switched to building with AMD processors back in 2002, except for one desktop that I built more than a year ago using Intel's 14-core Xeon E5-2683 V3. Way to go Ms Lisa Su. Kick that monopoly very hard in the ars!
It's true that Intel did a lot of bad things to crush AMD, including illegal monopolistic practices. However, the primary reason AMD was on the ropes for so many years was its wrong choice to go with a lower single-thread performance design.
To all Adobe competitors: Do you find it hard to break Adobe's grip on photo editing software market? I have a suggestion. Attack LR and Photoshop's Achilles Heel: slow software. Hire a few really good software engineers who specialize in multi-core performance optimization and optimize the crap out of your software. Adobe is improving bits here and there (preview generation, export, etc.) but their progress is painfully slow. This may be your only chance here. People may not switch for minor differences in features and pricing but they WILL switch if your software is several times faster which is doable IMO. Don't have the money to hire? Pitch the story to VC, show them the market cap of ADBE.
It is not that there aren't any editors better and more efficient than Adobe's products. It's the fact that it's an Adobe cult. You have people who have been using Adobe products for a long time, and not willing to switch because of the typical fear of the unknown. Plus, they don't want to spend a bit of time learning a new editor. DPR has had numerous articles about competitive products. Capture One, On One, Zoner X, Corel Paintshop Pro, ..., etc. Personally, I have never used, and probably will never use Adobe products.
I agree to some extent, but perhaps they are also not doing a good selling job. For example, where's the detailed info on performance gains, if any? Where are the step-by-step guides/videos for potential switchers from Adobe? They should be on the front page of their product websites, but I don't see them. Very few people will bother scrounging for such info.
> several times faster which is doable
Adobe Lightroom utilizes my 7700K/32GB RAM at 10% at peak, while lagging impossibly. It's doable to make it 10x faster without much effort.
I tried C1 and found it really hard to like; the sliders seem quite unpredictable. Also, Adobe is at least doing something right, as Android LR on my phone isn't much slower than LR Classic, so it's apparently several times as efficient considering the difference in hardware.
This sounds like the Nikon and Canon users complaining about the Sony menu system, and they refuse to move to any other camera manufacturer, even though they're upset at Nikon and Canon for their lack of innovation.
It took me a couple of hours to get used to Sony's menus, and now it's perfectly fine. It took me a couple of hours to get used to Fuji's menus, and now it's perfectly fine.
Capture One is perfectly fine in terms of usability. It's actually quite fantastic.
Not me - I swapped from Canon and don't really see much difference in Sony menus in terms of usability. I don't understand why people complain at all ;-)
We're all different though, and for me Capture One just didn't gel. I know people swear by it and I don't doubt that it's good, if you take the time to learn it.
I got the Sony version and then C1 moved the goalposts on upgrades. Business is business, but it didn't endear me to them. That, and they kept sending me emails in German. Didn't scream "customer-experience focused" at all...
@MikeDPR There are tons of video tutorials. Here is just an example (for ON1), with lots of pages: https://www.on1.com/videos/
Also, all the other developers have YouTube channels. Corel PaintShop Pro has lots of videos on its YouTube channel.
Zoner has tons of tutorials and a regular newsletter with various topics, with an interface that tries to mimic LR.
Generally, all of them have a free 30-day trial.
I agree with other people's comments about Capture One. I find its learning curve to be quite steep. Perhaps because it was originally aimed at Apple users and never was meant to be for Windows users? At least that is what I think. I have tried to use it several times, but I was frustrated with it every time. I finally gave up on it. So I don't use it or recommend it. My personal favorite is Corel's Paintshop Pro, although the company's support could be better. These companies, or most of them, do not have Adobe's deep pockets to spend on marketing.
@sh10453 Thank you for the link. But it's not that they don't have many tutorial videos. They need a "Switching from LR? Here's how" in one 10-minute video, prominently shown on the product's front page. Sure, it's good to have a library of detailed videos for each task, but people don't like wasting time searching through dozens of videos just to get started. It doesn't take many marketing dollars. It just seems to me they don't get what switchers really need.
I think they just don't want to get into a war with deep-pockets Adobe. Just about any software, not just photo editors, has a learning curve. With a 30-day free trial, it's worth looking at. Any switcher needs to invest a bit of time learning a new software package.
I started with LR, and having tried Snapseed for more than 30 days (more like 120), I find Android LR still far easier to use than Snapseed. And to think Google has far deeper pockets than Adobe...
After a few years on a Sony a7 I picked up a D750, and Sony seems so much better in terms of menus. Canon is outstanding in terms of menus, but I'm still so used to Sony that it seems perfect.
The only people I can really see who might benefit from this would be video editors editing and/or encoding a lot of 4k or 8k video, or perhaps people stitching gigapixel HDR photos or some other crazy stuff (although a lot of software will also use dedicated graphics if available too, thus lightening the load on the CPU in some tasks).
In case you haven't seen this, the new Ryzen CPUs are better than the best Intel can offer for Lightroom performance at all price points, and at the very top too:
As far as I can remember, Intel has always been the best choice for Lightroom, but the Zen 2 architecture and AMD's aggressive pricing have changed that.
The new Threadripper CPUs will likely help batch exports, but otherwise a Ryzen 3000 system is likely to be a better bet for Lightroom. The Threadrippers will be better for other forms of content creation than still image processing.
I'm very happy with my AMD 2700, the fastest CPU I've ever had. I can do video editing, Photoshop and more without any effort. We don't need Intel's ancient technology plagued with security flaws, overheating and power hunger, not to mention overpricing.
Nice. But as a "pure stills content creator" (aka "photographer") I would be much more interested in: 1. an all-around great, well-performing, newly developed raw converter plus image editing software ... 2. hardware (CPU, GPU, mainboard, interfaces, connectors, cables, monitors) and software delivering a true 10-bit color workflow ... after a decade of empty promises to that effect. Until then I will not spend more than $299 on a CPU, no matter how many cores it has and how many threads it RIPs.
Not everyone here is just a photographer. I combine photography and video with VFX/CGI. Being able to render out my shots faster is critical. This is a pretty nice release.
I am nearly done with Adobe. But don't find any of the current LR alternatives truly compelling for my needs. My hopes are for some other software company to give Adobe exactly the same kind of leapfrogging product competition AMD is currently dishing out to Intel.
It benefits Adobe Premiere much more; it's less beneficial (but still better than most other chips) for Photoshop or Lightroom. A simple Photoshop command doesn't need 64 cores; those commands execute on maybe 2-4 cores at most. Photoshoppers look for clock speed (how fast an instruction gets done) and some RAM (6-18 cores are plenty). If you are into making movies, rendering, server processing, and some gaming, lots of cores will benefit you. If you have multiple apps open you'll need some RAM, and if software is running in the background (say you are rendering) more cores mean more commands executed. This chip is huge and needs some power to run, and then some nice RGB LED fans to cool it (if you actually use the cores). Also, the GPU takes on 3D point calculations and shorter instructions; the CPU usually takes on the longer ones.
You don't need 64 cores for gaming; even 32 or 16 will go unused. The vast majority of games sit on 1-4 cores. Rare games use more, but the FPS boost is small and it's still 8 cores loaded at best.
Fact is, the computer will only use however many cores the software is written to utilize. A program written for 4 cores will only use 4 cores/threads. If the program is not multi-threaded, then you'll only use one core/thread, even if you have a 32-core processor. Other multi-threaded apps will utilize the other cores, but a single-threaded application won't.
Most games I've seen will use (up to a max) of 8 cores, plus games have become more reliant on the video card these days, which does the bulk of the processing and rendering.
I use professional architectural software called Revit ($2-3k a year per seat, 15 people in the office, x3 = $45,000). I just rendered an image and all 6 Xeon cores were at 100%; it has hyperthreading, which adds 6 more logical (not physical) cores, and clock speed was running at full tilt. The workstation Quadro graphics card sat at a mere 4%. When I do real-time CPU raytracing of a model, the cores are also all at 100% and the graphics card goes up to 40%. Games use the graphics card all the time, as does real-time rendering; offline rendering is mostly CPU. High-end software is designed to use a lot of cores, so I would probably benefit from lots more of them. Surfing the internet, not so much. I usually have 3-5 programs open at work, all running in the background, granted I'm really focused on 1. That is why Nvidia raytracing (RTX) graphics cards are so nice: what took forever in architecture a few years ago is almost instant. (Mine is not an RTX card; it's at least 3 years old.)
Revit uses V-Ray, a CPU-based render engine. Quadros are a scam: same hardware as the GTX series at 3x the price. I own one, by the way; next time I would just buy a "gamer" laptop/PC with much more oomph per buck and be done with it. For architecture, fast GPU renders are probably the future, especially with the new RTX-based real-time ray tracing and all that. Just today the new Twinmotion was postponed to Q1 2020, for example. I think as a Revit (or ArchiCAD) license owner you can get the new Twinmotion for free; go with that. Otherwise Lumion is the better option (at least till now), IMO, but it's not cheap. As for imaging etc., today nothing really beats an AMD CPU-based PC with a heavy GPU with a lot of cores (as I said before, don't be stupid and pay much more for Quadros with the same number of cores). More and more software gets a GPU boost: DaVinci, Lumion/Twinmotion, Blender, Metashape, etc., all with heavy GPU gains, and the list keeps growing. Good times.
@spectro: Amazed to hear your staggeringly expensive software is not fully utilising your GPU! I use Blender ($0 per seat) for product rendering, and it will happily max out the GPU at 100% when rendering (and has been capable of that for years). If you have two GPUs fitted, it uses both at 100% to halve the render time. The i7s and Xeons in my PCs have been irrelevant: 2000+ CUDA cores do the job much faster than a CPU, and cost less, provided you avoid the great Quadro scam.
I'm just an architect at an office; the IT people/company put the computer together. Revit is the architecture software standard. It was once AutoCAD, and we would render in 3D Studio, formZ, Lightscape. We don't render much on a daily basis; rendering is usually done by the younger people in the firm. Most firms give you a new computer every 2 years or upgrade parts, which is why most arch firms use PCs. I guess a few mom-and-pop firms use Macs, but they didn't get Revit (PC-only; AutoCAD for a while). Technology has gotten so much better these days. When a computer crashed, that was 2-6 hours of lost data or a corrupt file; you would go to the IT guy and ask for yesterday's backup to save over the corrupt files. That doesn't happen much these days; the software has gotten better, there were so many bugs back then. I don't believe in Quadro anymore with the new Nvidia RTX. Anyway, we got a new server since more new people are working, so a chip like this is ideal for a server. I think Europe uses ArchiCAD more.
I've heard of Blender but never used it. Quadro and ATI CrossFire/workstation cards are, I believe, OpenGL-optimized; gaming is more MS DirectX. Maybe the line between gaming and workstation has blurred now, but I'm not sure. I believe Revit's real-time view is OpenGL (they don't call it that anymore). V-Ray is an add-on plugin we have for Revit rendering (I don't use it much).
I think Revit utilizes GPU acceleration; most design apps these days do. Some will even work with non-dedicated GPUs (i.e. "integrated graphics"), but those are much slower, so the whole process would be slower, not to mention susceptible to crashing in some cases. Some products just refuse to work if you don't have an appropriate video card (like SolidWorks).
@spectro: I must push back on Revit as the industry standard. Yes, most of the bigger firms outside the EU (besides Japan, etc.) use it, but this is only because of the heavy and aggressive marketing Autodesk does around the world (when the BIM transition was happening, all the AutoCAD users more or less just jumped into the Autodesk trap). The whole 3D BIM standard in architecture wasn't laid out by Revit but by ArchiCAD (and some others), which has been here since the eighties, with all that 3D interface etc. Revit is basically a copy of it; they knew they had to do something different, since the architecture versions of AutoCAD were a joke. The same BS stereotype I keep hearing is that ArchiCAD is for small residential projects only. Well, we work on 30k m2+ projects in teamwork mode without any problems, on laptops, on multiple monitors, etc. Firms like BIG use it around the world, on everything from small to colossal projects. I would consider that a standard too, but I guess marketing knows better.
lol. CPUs like this may come down to $500 in a few years; it will be decades until we get this kind of processing power for $50. Progress has slowed down considerably. But it's good that AMD has "Ryzen from the dead" and leapfrogged Intel; this should help us get better hardware at better prices. :-(
"Just imagine what kind of performance you would get if Adobe would do a total re-write of their code for modern OS's!"
I have a 7700K and 32GB RAM. LR utilizes 10% of the CPU at peak while lagging impossibly. We wouldn't be getting anything beyond what's owed to us: normal performance, nothing more. Right now Adobe's legacy single-core code from 2003, which they btw just bought from another software company, is critically unoptimized. I'm not even talking about Photoshop, whose legacy code can't utilize more than one core. People buy 16 cores and PS doesn't see 15 of them, so how would a Threadripper boost performance for photographers? LR would utilize 2% of its capacity? PS wouldn't use 31 of the cores? Thanks Adobe, many thanks. I'm so glad I'm paying you those 10 bucks per month for the photography package. I'm so glad you had $3.62 billion net income in 2018. Cheers, Adobe.
A lot of it probably has to do with the algorithms themselves, used for things like spot healing, content-aware fill, etc., which would not necessarily change even with a rewrite of the software. They might be able to speed up the UI a bit, but the core functions would likely stay the same, as they have spent years perfecting the algorithms, and in some cases it's just new code stacked on existing code.
And let's not forget that a complete rewrite can sometimes make things worse than they were. Remember Windows Vista when it was released? I won't say Adobe products are perfect, because they are far from it, but they aren't "broken" either (at least their laptop/desktop apps; obviously PS for iPad has a long way to go).
@Sirhawkeye "Remember Windows Vista when it was released?"
Windows has not been new code since NT was released. It's still the same core.
I understand that some of the algorithms are going to be the same, but not having true multi-core multitasking built in is just insane.
Using InDesign vs Affinity is night-and-day different. There is no "low-res or high-res preview" in Affinity; it's always high-res, yet it's faster than ID is in its low-res mode. Why? The code is new; it's made to run on the latest hardware. InDesign has the problem that it runs slower on newer Macs just because Adobe refuses to adopt Metal (1 or 2). Yet the performance difference is that much in Affinity's favor.
@brycesteiner: Vista was a rewrite of the kernel and userspace. Actually Windows 7, which is lauded so much, was only an improvement built on Vista's foundations. For sure it was leaner, but hardware also caught up with its requirements. Windows 8 had a new kernel and improved speed over 7, but due to a mishandled launch and the focus on new form factors it was a massive flop. Windows 10 was seen as unnecessary, and even though it's faster and better in many regards, it's disliked by many users because it feels unnecessary in light of how usable Windows 7 still is (and how flawlessly Windows 7 works on older computers). @iozx: You're right that it's not necessary to completely rewrite Photoshop, and it's so modular that single modules can drag down performance even if the rest of the product is fast. But I will disagree about multithreaded use: Photoshop and Lightroom are able to use the extra threads available to speed up batch processing and some filters.
To be honest, Microsoft goes through these good-bad flip-flops with its OS, according to many people (both IT pros and the general public).
XP was a stable OS after the first SP (although users could get the occasional random reboot and recovery popup, which I also saw in Windows 7 but haven't seen since in 8 and 10). Vista was a flop (I didn't use it much myself, as I had read enough to avoid it, and it was short-lived, like 3 years). 7 was for the most part stable. 8 was a flop (it had a lot of issues out of the box, but 8.1, which MS considers a different OS, fixed some of the important ones like networking and stability). 10, as far as I can tell, is among the most stable OSes to date. It doesn't play nice with some older hardware, but that's somewhat expected. In the 4 years I've used it, I have yet to see a BSOD.
Most users of Photoshop are ordinary people, photographers or amateurs, and CPUs at this price are not for them; they will suit only very rich commercial photographers or other very rich users.
Not sure about Adobe, but my Affinity Photo app seems slow on an 8th-gen Intel chip. I think we are simply processing massive amounts of data now, thanks to extremely high resolutions and lots of post-processing.
I don't have experience writing image-editing software specifically, but on the basis that I've done quite well over the years fixing less-than-wonderful attempts at performance-critical multi-threaded code, and knowing how stonkingly powerful modern processors are (and by modern I mean most stuff from the last 10 or so years), I'm going to hazard a guess that most perceived performance problems are indeed down to code quality rather than inadequate hardware.
Take Affinity Photo: much as I love it, it has problems in this area. For example, the Colour Replacement Brush Tool is agonisingly slow on my machine, and all the while completely maxes out a single core, even for quite minor edits. *If* calculating the replacement really is that processor-intensive, couldn't the operation be multi-threaded, or even farmed out to the GPU?
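For what it's worth, whole-image colour replacement is the kind of per-pixel operation that vectorises trivially. This NumPy sketch is my own illustration, not how Affinity or Photoshop actually implement their brushes (whose per-stroke sampling is surely more involved): it replaces every pixel within a tolerance of a target colour in a single array pass rather than a per-pixel loop.

```python
import numpy as np

def replace_colour(img, target, replacement, tol=30.0):
    # img: H x W x 3 uint8 array. The whole image is processed at once;
    # NumPy runs this as tight C loops over the array rather than
    # per-pixel Python calls.
    diff = img.astype(np.float32) - np.asarray(target, np.float32)
    mask = np.sqrt((diff ** 2).sum(axis=-1)) < tol  # Euclidean colour distance
    out = img.copy()
    out[mask] = replacement  # overwrite only the matching pixels
    return out
```

The same array formulation maps directly onto GPU array libraries, which is roughly what "farm it out to the GPU" would mean in practice.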
Alas, I find that the color replacement brush on Affinity Photo and Photoshop are both agonizingly slow. I'm working on a maxed-out (8-core, 32GB) 2019 15" Macbook Pro.
Interesting...it almost suggests that there's something inherently processor-intensive about colour replacement. Though I did try the same thing on an ancient version of PS Elements, and it was fine for speed/responsiveness.
You would speed up nothing. I paid 1000 bucks for a new PC, a 7700K with 32GB RAM, a few years ago in the hope that it would speed up LR. And now I'm the happy owner of a powerful CPU that Lightroom's 2003 legacy single-core code utilizes at 10% at peak, while still lagging impossibly. So don't bother. Btw, Photoshop, Adobe's flagship product, can't utilize more than one core. So you would buy a Threadripper for $2,000 AUD and Adobe Photoshop would use 1 of your brand new CPU's 32 cores. Great, isn't it?
It's more likely that we are getting to a point where software demands are growing faster than what hardware can provide. The innovation curve for CPUs has sort of flattened out over the past 10+ years, with each new iteration bringing less additional processing power than the generation before. For example, the newest 9th-gen Intel processors, from what I can see, don't bring any substantial improvements over the 8th-gen ones. The newer ones probably have better power efficiency, but as far as raw processing power, probably not much more; I'd say maybe a 5-10% increase at most (though this also depends on which two processors you compare).
thx1138, while you are enjoying yourself playing the ever-popular attack-Adobe game :-) Adobe does not have any part in this video or article; it is all about AMD and their claims. The CPU companies should also be taken to task for some of their nonsense claims. The theoretical benchtests displayed in the video show around 1.5x the speed of the Intel chip. Meanwhile the guy from the computer graphics firm is telling us that something that used to take 5 minutes now takes 5 seconds. I'm afraid I cry BS on that; there is absolutely no way that in a legit high-end setup test this new chip will give a 60x faster real-life result.
Get a clue, Adobe's software has always been poorly optimised. Like when LR 7 came along boasting big speed improvements and ran like an absolute dog on very high-end systems. The forums were in meltdown, and the number of idiotic things Adobe wanted you to do to try and fix their cr@p code was ridiculous. It will take a Threadripper to bring it up to average performance.
@thx1138 Nice that you have gone off on a tangent to attack a point I never made. You were having a rant at Adobe, who had nothing to do with this video; that was all I mentioned. I agree with you that Adobe software could do with a huge performance upgrade. However, I stand by my comment that the guy is blowing smoke claiming what takes 5 minutes on a current high-end system will take 5 seconds. And frankly, anyone who expects that to happen in real life is in for a huge disappointment.
AMD is on a roll with desktop CPUs at the moment, servers too. I am rocking the mainstream 12c/24t Zen 2 Ryzen 3900X; great CPU, can't complain coming from an i7-4790K...
But at this moment Intel still has better offerings in the laptop market: quite powerful 8c/16t i9s and the like. I just got an HP Omen with a 6c/12t i7-9750H and a GTX 1660 Ti, and it crunches through DaVinci Resolve, DxO PhotoLab etc. quite well. It has a decent gaming-centred 144Hz IPS panel too. For £1,030. Not bad value at all.
But on desktop, the best bang for the buck is AMD, apart from PCs aimed solely at gaming, where more cores are not necessarily an advantage; there, IPC (single-thread performance) matters more, and Intel gets there with its 5GHz clocks, helped by the lower memory latency that games are sensitive to :-)
The newest Intel CPUs have the major security flaws fixed too, but not all of them, and who knows what comes next. Anyhow, Intel's past security patches have hurt NVMe performance before.
While I have no interest in these TRs, which honestly don't make sense for normal consumers (even professionals), I am seriously considering upgrading my old i7-3770 to a Ryzen 5 3600 or 3700X at some point.
I also abuse multi-tasking, often with several Adobe programs, several MS Office programs, Chrome, iTunes, and sometimes CAD software running simultaneously. So having a few more threads will help as well.
Most users will see better performance from more pedestrian 6-core or 8-core units with higher clocks. Only software that is coded to be highly threaded will utilize these processors.
Remember when dual-core chips were exotic? :D I remember building an Intel 820 in college and being wowed at the processing power!
I wouldn't say it's specifically for content creators.
It takes an extremely specific kind of workflow to take advantage of more than 8 cores, let alone 24 or 32 cores.
These seem more targeted towards small business server applications, or scientific / engineering simulation work.
Aside from that, CG render-farming and some video workflows, but depending on the specific application, some video work will also favour higher clocks over more, slower cores...
@Androole It's a platform that exists for a purpose. People shouldn't even start talking about "most users", this point of view is irrelevant. And it's not just about the number of cores, that's why I mentioned the socket and chipset. You're buying much more than just CPU cores with Threadripper.
I moved from an i9 9900K to an AMD 3900X, and did I make a mistake? Hell no. AMD owns it, especially the export speed! But don't get an ASUS board for that chip; my ASUS died within a month and I moved to MSI.
Threadripper will certainly be the best you can get, but you can buy "just" the new "normal" AMD CPUs like the Ryzen 3700X (8 cores, $330) or 3900X (12 cores, $500) and still reap humongous gains over whatever Intel CPU you probably have now. We are talking a 2x RAW develop & export speedup over a top-of-the-line 8-core Intel CPU like the i9-9900K. See the benchmark comparison here. What's even more impressive is that they accomplish it while consuming LESS power than the 9900K.
Interesting results regarding Zen 2 vs the first gen, thanks for linking it... That being said, it still seems like LR is only very well threaded for the export/import operations and actual editing doesn't benefit a ton from extra cores... The former is surely valuable for anyone churning thru a ton of shots but it doesn't really move me to upgrade my aging 6700K.
Not like I'm looking to stick with LR anyway, the linked article also makes some comments about LR being better optimized for using multiple cores during exports than Capture One, so I imagine a lot of the other non-Adobe alternatives are even worse off in this regard, sigh.
Don't misinterpret my previous comment btw, if I was building a system today it would absolutely be around a 3700X, or a 3900X if I wanted to splurge a little and have more cores for video editing... I've got nothing against AMD.
I was all too eager to give AMD my money during the A64 days, and I'm glad they're giving Intel stiff competition on the desktop end these days, since things had gotten very stale for a while. I ended up with the quad-core 6700K precisely because Intel had rested on its laurels; hexa-cores and higher core counts should've been cheaper by 2015...
I'll probably give it another gen or about 1y before the upgrade bug bites me again. Hopefully AMD starts getting more design wins on the mobile/laptop end as well, tho I'm not sure how efficiency stacks up there, they've made huge strides on the desktop side tho.
I just put together a system with the 12-core Ryzen 3900x. I have two VMs running, a 24-hour background H.264 encode, Photoshop, LR, an occasional compile, and a YT video playing. It's barely breaking a sweat.
One thing to consider about Ryzen is they're not vulnerable to many of the side-channel processor exploits that plague equivalent Intel processors. This means very few performance-robbing microcode and OS kernel workarounds, extending AMD's effective performance and value lead over Intel.
Would Photoshop CS6 and Affinity Photo benefit from so many cores? Video and 3D rendering is one thing, but bitmap applications, where the software doesn't know what's going to happen next, I'm not sure about. However, since I do play chess and chess software can be parallelized big time, the prospect of having a good chess program run on 32 cores might be interesting....
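For what it's worth, the tiling idea that image editors would need is simple to sketch. Below is a minimal, hypothetical Python illustration (not how Photoshop or Affinity actually work internally) of splitting a bitmap into independent tiles and processing them on separate cores with the standard library's multiprocessing pool:

```python
from multiprocessing import Pool, cpu_count

def brighten_tile(tile):
    # Stand-in for per-tile filter work; each tile is independent,
    # so tiles can run on separate cores with no coordination.
    return [min(255, px + 10) for px in tile]

def process_image(pixels, n_tiles=32):
    # Split the flat pixel buffer into tiles, farm the tiles out
    # to a pool of worker processes (one per core), then stitch
    # the processed tiles back together in order.
    size = max(1, len(pixels) // n_tiles)
    tiles = [pixels[i:i + size] for i in range(0, len(pixels), size)]
    with Pool(processes=cpu_count()) as pool:
        done = pool.map(brighten_tile, tiles)
    return [px for tile in done for px in tile]

if __name__ == "__main__":
    image = [100] * 4096          # fake 4096-pixel grayscale image
    out = process_image(image)
    print(len(out), out[0])       # 4096 110
```

Operations like this map cleanly onto many cores; the hard part for legacy editors is that brushes, history states and layer compositing have ordering dependencies that don't split into independent tiles this neatly.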
@hjs_koeln You would need the latest versions to get the full benefits. I doubt CS6 would take advantage of more than 2 cores. You would end up with 30 cores sitting idle, doing nothing.
There is other photo editing software that utilizes more than 2 cores. Even so, 32 cores is overkill - especially in terms of price to performance.
If you edit and render video as well, then there's a reason to have 32 cores. I believe 8 cores is still the sweet spot for photography. It's tough to beat the price to performance of the $330 USD 3700x.
If you need that extra little oomph, but are still on $2,000-$2,500 budget, then the $500 3900x sounds like the best bang for the buck. The 3900x still kicks ass in Photoshop, even though 8-10 cores may be sitting idle for most tasks.
There's a huge difference between a want and a need. I want it, but can't justify it. Others will need it and that's where the cost can be justified. Same as camera gear basically.
These are physically large CPUs as well, and more motherboards will slowly come. Times are interesting for sure, and what will these CPUs be worth in two years' time ;-)
Yes, this is high-end gear for designers who need speed all day. Happy with my FX-8320, which is worth, what, nowadays, $5? Hm, just checked: Amazon is still selling it for $183. Baffling; I bought it maybe 7 years ago.
Yeah, same here. Don't need it, but it would be nice to have. Then you need all the other things to go with it: RAM, graphics card/s, SSDs, and they need to be the best as well. I built mine a while ago now with the AMD Ryzen 5 2600 6-core CPU, and for the cost, I have no complaints. Works well and will for years to come. The other is an Intel NUC, small and still fast enough with an old i5 in it.
Same. Only 2 weeks ago I got myself the 12-core 3900X AMD beast. This thing can handle anything. The only issue is the motherboards for it seem like they've been rushed. Still in the process of updating the BIOS and tweaking.
@milkod2001: your review has the opposite effect on me. The last thing I want to be doing after buying a CPU and motherboard is updating the BIOS and tweaking. That makes it a "do not buy" for me.
@XXTwnz I'm happy with my new AMD system, but honestly, if I could go back I'd just get an Intel 9900K. A friend of mine has it; he just put the whole system together and enjoys it. No messing with anything.
Any problem that decomposes neatly into parallel processes will get dramatic speed-ups: a factor of 4 over an 8-core chip. Great for scientific and engineering workstations. Lots of fast I/O. People will be studying how to make the best use of this.
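A hedged sketch of that kind of decomposition, assuming a CPU-bound workload that splits into independent chunks (the `simulate` workload here is a made-up stand-in, not any real benchmark):

```python
from multiprocessing import Pool, cpu_count

def simulate(chunk):
    # Stand-in for a CPU-bound unit of work: a render bucket,
    # a simulation step, a batch of RAW files to develop, ...
    return sum(x * x for x in chunk)

def run_parallel(data, workers=None):
    # Decompose the problem into one chunk per worker, run the
    # chunks on separate cores, then combine the partial results.
    workers = workers or cpu_count()
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(processes=workers) as pool:
        return sum(pool.map(simulate, chunks))

if __name__ == "__main__":
    data = list(range(100_000))
    assert run_parallel(data) == sum(x * x for x in data)
    print("parallel result matches serial result")
```

For jobs like this, doubling the core count roughly halves the wall-clock time, up to the limits set by Amdahl's law, memory bandwidth and I/O.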
You would get zero performance boost in Lightroom, dude. I have a 7700K/32GB RAM, and LR utilizes 10% of CPU capacity at peak while lagging impossibly. It's so unoptimized that you would get zero boost for your 1400 bucks. Basically, Adobe just bought Lightroom in 2003 from another software developer, and I guess they still haven't changed the code. Better to spend that money on a new lens or camera, or an SSD for your photography storage.