M1 looks pretty amazing

They gotta go full x3 to make PR & Marketing go x3 BRRRRRRR, otherwise there’s no point :laughing:

2 Likes

@LizziAS

Sadly it will mainly be only iOS games that will work… still a move in the right direction for Apple. I'll wait for reviews before I order a 13"… Doesn't worry me, as I don't play games.

1 Like

Looks pretty good from reading this.

Edit: Another article says:

Apple M1 beats Intel Xeon in iMac Pro so badly it hertz

3 Likes

I’ll be looking for the new chip in the iMac, and then I might be interested.

1 Like

There’s no evidence to support that theory. They have Rosetta 2 playing Intel games on their silicon in their PR video. Granted, I want to see that in person and test the extent of it, but I have no doubt it works. They have smart, competent people who love their brand and code very well.

Quite the contrary, in fact.

There are a few things to note here about the new M1 chips and Rosetta 2.

  1. We don’t have any independent performance tests done by third parties. All we know is that it is n times faster than a generic PC. Their marketing material seems legit, but in reality it doesn’t contain much info. Compare that to AMD’s RX 6000 launch, where we got specific gaming benchmarks against the 2080 Ti and 3090 in a plethora of games.

  2. Geekbench is not a reliable source for determining system performance, especially across different architectures or operating systems.
    In fact, I’d go as far as to say that the industry should reject synthetic benchmarks entirely and rely on “real” benchmarks, i.e. Blender renders, Premiere Pro renders (Kdenlive or DaVinci Resolve for us Linux users), and application compile tests (see the timing-harness sketch at the end of this post). Reddit user u/OmNomDeBonBon made a great comment explaining this better than I could ever do.

JFYI, Geekbench isn’t taken seriously in the PC space. It’s only useful for measuring differences in performance between chips of the same architecture and similar design e.g. going from A12 to A13, or Ryzen 2600 to Ryzen 3600.

It’s not useful for measuring performance differences between different implementations of the same ISA from different OEMs e.g. A13 and SD855, or i5-8400 to Ryzen 2600.

tl;dr: A13 is faster but we won’t know how much faster until we get real-world benchmarks, not Geekbench.


Edit: a lot of people are asking me to justify my analysis, so let me explain why Geekbench and other similar testing suites aren’t taken seriously amongst most of the technical press and the wider enthusiast market. This has turned into an effortpost, so I hope at least one person gets something useful out of this. Scroll down to the end for the overall tl;dr.

BUSINESS MODEL: why do they say we need their benchmark?

Here are some examples of highly questionable benchmarking organisations:

  • Geekbench
  • FutureMark (3DMark, PCMark, VRMark, etc.)
  • Passmark (PassMark, BurnInTest, the hilarious “MemTest86 Pro”)
  • Userbenchmark

What do they have in common? Their product is their benchmark. Geekbench is freemium/paid, FutureMark and Passmark products are paid, and Userbenchmark relies on click-through ad revenue and SEO spam.

This is unlike benchmarks like, say, Cinebench, where the actual product is Cinema 4D, a professional 3D modelling/rendering application. The benchmark suite itself (Cinebench) is free with no “premium edition” - the benchmark exists to tell consumers how different CPUs will handle Cinebench. Likewise, PC games often feature built-in benchmarks, designed to offer repeatable performance measurement in a controlled environment.

The likes of Geekbench and Futuremark have business models which rely upon convincing consumers that the GB/FM performance metrics are trustworthy. There are multiple ways they achieve this, but the easiest method with the highest ROI is to just optimise your benchmark for the market leader’s products. Passmark, for example, is notoriously biased towards Intel, because Intel have ~80% desktop market share and people who buy Passmark likely won’t buy Passmark again if it shows their Intel CPU scored less than a competing AMD CPU. This also means the market leader will trumpet the benchmark product(s) in its marketing materials - masses of free advertising for the benchmarking company, leading to more sales of their benchmarking products.

A benchmarking company’s entire existence is contingent upon convincing consumers to buy their product, convincing the biggest players to use their product in marketing slides, and convincing hardware/software reviewers to purchase the enterprise editions of the benchmark. They have a profit motive to get you to think their products give consumers useful data, when they don’t. I’ll explain why later.

HISTORY: why do we have synthetic benchmarks in the first place?

The ONLY REASON we ever needed synthetic benchmarks was because repeatable benchmarking of commonly used applications in a controlled environment used to be extremely expensive and time-consuming. Decades ago, if you wanted to do a set of benchmarks, you needed a computer that could run 2D and 3D games, MS Office, 3D modelling apps, batch jobs in image editing software, video/audio encoding and so on. Not only was the total cost of this software prohibitive, but the cost of the PC you’d need to bench these products would be through the roof. Additionally, it’d take days to run all of the benchmarks and up to a week to analyse and visualise the data in a way that’d make sense to a consumer.

This is where AIO benchmark suites came in - they provided a highly synthetic benchmark suite which might take anywhere from a minute to a couple of hours, and would spit out a simple-to-understand numeric score. 17849 is better than 15838, right? The cost of the benchmarking application might’ve only been $100, so something like 1/10th the cost of buying those real-world applications for benchmarking. The reviewer not only saved over a week of time, but also hundreds or thousands of dollars by not buying all the apps needed to run the benches. So, synthetic benchmarks gained widespread popularity as a more feasible alternative to real-world benches.

ACCURACY: do they reflect real-world performance?

Lastly, there’s the efficacy of these benchmarks. Geekbench, Passmark, 3dmark et al often produce results which aren’t representative of real-world applications. This defeats the entire purpose of these benchmarks, whose stated goal is to give you performance metrics representative of real-world use. An excellent example is in this thread: Apple’s SoCs destroy Qualcomm Snapdragon, Samsung Exynos and Huawei Kirin in Geekbench, yet this destruction isn’t replicated in real-world application benchmarks.

Indeed, real-world speed tests from the last few years almost always show Samsung’s Qualcomm-based Galaxy S and Note phones beating the iPhones despite the iPhones having a SoC which demolishes Qualcomm’s SoCs in Geekbench. Hell, the inferior Exynos Galaxy S10+ (which is slower than the Snapdragon S10) beats the iPhone XS Max in speed tests: https://www.youtube.com/watch?v=Cv1xwda0xkQ This is despite the iPhone XS Max smoking both the Snapdragon and Exynos variants of the S10+ in Geekbench. There are also videos of the Galaxy Note10+ (same SD 855 as the S10+) beating the iPhone XS Max by an embarrassingly wide margin in speed tests covering a dozen or so popular mobile apps (Spotify, Facebook, Insta, YouTube, Subway Surfer, etc.).

Another example is Userbenchmark. This company was caught changing their scoring system to benefit Intel after AMD’s new Ryzen 3000 series started destroying Intel’s objectively inferior CPUs in their benchmark. They now rank inferior Intel CPUs above superior AMD ones, because they have a business incentive to tip the scales in favour of Intel. Another is BAPCo, a benchmarking company which modified their entire benchmark so that it ran just one test in a loop, a test which just so happened to heavily favour Intel CPUs. Before the modification, the benchmark showed AMD CPUs destroying Intel CPUs. It was then revealed BAPCo was funded by Intel, and this contributed to a court case where Intel to this day still needs to make special disclosures about its benchmarking that nobody else does. https://www.theinquirer.net/inquirer/news/1009264/the-case-against-bapco

SUMMARY

The tl;dr here is, these synthetic benchmarks do not reflect real-world performance to an acceptable level of accuracy, and they should absolutely not be factored into buying decisions. Geekbench and other synthetic benchmarking suites don’t produce useful data unless you want to compare two CPUs of the same architecture with a similar design from the same vendor e.g. to test generational improvements between A12 and A13, or to test scaling between 6-core Ryzen 3600 and 8-core Ryzen 3700X. Gamers, photographers, 3D modellers, etc. generally don’t give a damn about Geekbench, because they see first-hand how the Geekbench score doesn’t reflect real-world performance when you compare different implementations of the same ISA e.g. Intel and AMD, let alone different ISAs (e.g. Apple comparing ARM A13 to an x86 laptop CPU).
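
To make the scoring-weight trick from the quoted comment concrete, here’s a toy sketch with made-up subscores (not real measurements from any actual CPU) showing how a vendor can flip which chip “wins” purely by reweighting its composite score, without changing a single measurement:

```python
# Made-up subscores for two hypothetical CPUs; no real hardware measured.
subscores = {
    "cpu_a": {"single_core": 130, "multi_core": 260},
    "cpu_b": {"single_core": 140, "multi_core": 180},
}

def composite(scores, w_single):
    """Weighted sum: w_single on single-core, the remainder on multi-core."""
    return w_single * scores["single_core"] + (1 - w_single) * scores["multi_core"]

for w in (0.3, 0.9):  # "old" weighting vs. a new single-core-heavy weighting
    winner = max(subscores, key=lambda cpu: composite(subscores[cpu], w))
    print(f"single-core weight {w:.0%}: winner is {winner}")
# -> at 30% single-core weight, cpu_a wins on multi-core strength.
# -> at 90% single-core weight, cpu_b wins. Same numbers, opposite headline.
```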

  3. It doesn’t really matter whether Rosetta 2 exists or not. At the end of the day, compatibility layers are a workaround hack. Sometimes they work with minimal performance issues, sometimes they have severe performance issues, sometimes they don’t work at all. If anyone understands the pain of compatibility layers, it’s us Linux users trying to get Wine to work properly. It’s why I still keep a separate Windows drive around just to play games. Given how sensitive the gaming experience is to performance, having to jump through hoops to get a game running in an unstable, buggy, and laggy state (considering that most of their systems don’t have much power compared to an equivalently priced gaming PC) is not going to be pleasant at all. (If you’re curious whether a given process is actually being translated, see the Rosetta check sketch at the end of this post.)

  4. As someone who dabbles in game development, Metal just plain sucks. Actually, Vulkan and DirectX can be a pain in the arse as well. Only OpenGL works nicely (though I don’t get as much control as I would with the other APIs). But having to put up with another system’s API just to get a game running on one OS is demoralising, when I could develop for a plethora of systems using one API (all hail Vulkan and OpenGL).
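
As promised in point 2, here’s roughly what I mean by a “real” benchmark: just time the workload you actually run, repeatedly, and take the median. A minimal sketch (the blender -b/-f flags are the real headless-render options; “scene.blend” is a placeholder for whatever file you actually work with):

```python
import shutil
import statistics
import subprocess
import time

def time_command(cmd, runs=3):
    """Return the median wall-clock time of running cmd, in seconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, check=True, capture_output=True)
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

if __name__ == "__main__":
    # Headless Blender render of frame 1 (-b = background, -f = frame).
    # "scene.blend" is a placeholder: benchmark a file you actually use.
    if shutil.which("blender"):
        t = time_command(["blender", "-b", "scene.blend", "-f", "1"])
        print(f"blender render, median of 3 runs: {t:.1f}s")
```

The same harness works for a compile test or a video export - swap in whatever command line your real workload uses.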
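And for point 3: on Apple Silicon, macOS exposes a sysctl key, sysctl.proc_translated, that reports whether the calling process is running under Rosetta 2 translation. A minimal sketch (assumes stock macOS with the sysctl binary on PATH; note it checks the Python process itself, so a natively built arm64 Python reports 0):

```python
import subprocess

def running_under_rosetta():
    """True if this process is being translated by Rosetta 2.

    The macOS key sysctl.proc_translated is 1 under Rosetta, 0 for a
    native process, and absent on Intel Macs (-i ignores unknown keys).
    """
    try:
        out = subprocess.run(
            ["sysctl", "-in", "sysctl.proc_translated"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
    except (OSError, subprocess.CalledProcessError):
        return False  # not macOS, or sysctl unavailable
    return out == "1"

print("Rosetta 2 translation:", running_under_rosetta())
```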

2 Likes

I think I’ll come to my own conclusions in due time, based on fact, not conjecture. I’d trust an opinion based on testing by Phoronix more. Right now it looks very promising, but it may end up being a flash in the pan. Time will tell the real story. No offence!

1 Like

There’s a small bit of info over on hexus…

3 Likes

Ooooh noooes, so it’s x2 more :ox: :poop: …who could have possibly thought of that :rofl:

3 Likes

(ARM) Question to Torvalds (realworldtech.com)

As great as the M1 may be, I hope it provides the overdue initial spark for other notebook manufacturers to get serious about ARM. And not just with vaporware.

2 Likes

Ooooops x2 more surveillance :shushing_face:

:male_detective: shh, it was always a thing

3 Likes

This, but replace ARM with RISC-V

A first short check of a MacBook Pro M1 (low RAM), in German.

If these machines were made with RISC-V, there would be no DRM for playing Netflix and the like, is that right?

I already got the 2020 iPhone SE, so they have all my data anyway… they are watching me watching you watching me :slight_smile:

5 Likes

Yepp… that’s my thought also. With RISC-V taking off, we would really be able to get a fully open system from hardware to software. Some players are apparently going the RISC-V route and developing the technology further, but who knows how much time they need to get a competitive product on the market. I was about to buy a Mac last year, but then I noticed they had just introduced the soldered SSD. I was like “are you kidding me? No way!”.

It seems they are going even deeper into that paradigm: now RAM too.

And all in the name of “security”. Definitely not because they want to get some extra $800 if the buyer decides to get the 2TB model. Far be it from them. It’s just that they want their buyers to feel secure.

Like google offering to spy on you to “offer a better experience”.

5 Likes

As they did in “Braveheart”: lift up your kilts, bare your asses to the enemy, and shout FREEDOM!!! :wink: :crazy_face:

1 Like

Not necessarily. I don’t believe DRM needs ties to physical hardware. As seen with the Pinebook Pro, as long as you use Chrome or enable DRM playback in Firefox, 4K Netflix streaming works (well, if you can call 1 fps streaming “working”).

Oooh 1 FPS!!!

1 Like

But I just want to call out one, in particular: the frame rate on Shadow of the Tomb Raider. Thirty-eight frames per second is a respectable number for a gaming laptop with a low-end graphics card. It’s nigh unheard of for a computer with an integrated GPU.

1 Like