Android performance in 2016 is (less) embarrassingly awful

I was just pointing out that single-thread performance matters, and often drives overall performance; I think that holds across platforms.

Nice chart. I only looked briefly at that article, but the benchmark was all done on one computer with threads and cores disabled, so it’s not surprising that performance drops; not exactly real world. On the other hand, PC Perspective benchmarked Ashes of the Singularity, real world, DirectX 12, 1600p resolution… the i3 gets 39.8 FPS and the i7-6700K gets 42.3 FPS (42.3 / 39.8 ≈ 1.06), so about six percent faster for something like 250% more money. Obviously cherry-picking, but it goes both ways.

Anyhow I love to debate processors… and I’m sure some people out there appreciate the benefits of i7 in a lot of apps, no sarcasm.

https://www.pcper.com/files/imagecache/article_max_width/review/2015-08-16/ashes-gtx980.png

I’m sorry, I don’t think I was clear. That chart isn’t actual framerate, there’s no specific GPU related to it. That’s the framerate if the GPU wasn’t holding back the CPU based on numbers provided by the developer. I agree with your comments and your chart that for current gen gaming situations there is absolutely no reason to buy a high end CPU because current gen GPUs are the bottleneck. I wouldn’t recommend anyone w/o very specific usage buy a top of the line CPU right now.

There are obviously sweet spots in the price/performance curve that will be dictated by the lagging components. But declaring that 2 or 4 cores is all anyone needs for technological reasons, rather than as a current sweet spot, is like claiming in 2005 that 4GB of RAM was the most anyone would ever need just because 32-bit OSes couldn’t take advantage of more at that moment in time.

This reflects a profound misunderstanding of how memory is allocated on a PC. In 32-bit land, you couldn’t even use more than 2GB of RAM effectively.

For starters, it is hilarious to me that you linked an article where you yourself recommend 3GB as some kind of mic drop on why anyone suggesting more than 2GB has a ‘profound misunderstanding’.

Secondly, why bother responding to posts you clearly didn’t read? My literal point is that there are technological sweet spots at any given time, but those are not long-term things. In 2005, 64-bit processors and OSes were available and gaining traction; a Pentium 4 with 64-bit instructions running a 32-bit OS was becoming a common config. Hardware capabilities were being held back by the lagging technology. This was and is my point.

Basically, a single process can’t even allocate 2GB reliably on 32-bit Windows, since 2GB is the default user address space and fragmentation eats into it. Multiple processes can use up to 4GB in total. There’s also the issue of hardware reserving a hole in the 32-bit memory map, so even if you have 4GB installed, you can’t use all of it in 32-bit Windows land.
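If you want to see that ceiling for yourself, here’s a minimal C sketch (mine, not from any post in this thread): build it as a 32-bit binary and run it on 32-bit Windows, and the total will typically stall well short of 2GB thanks to the 2GB user/kernel split and address-space fragmentation.

```c
/* Minimal sketch: probe how much a single 32-bit process can actually
 * allocate before the address space runs out. Build as a 32-bit binary
 * (e.g. gcc -m32 probe.c) and run on 32-bit Windows; the numbers are
 * illustrative, not a precise measurement. The leak is intentional. */
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    const size_t chunk = 64 * 1024 * 1024;  /* allocate in 64 MB chunks */
    size_t total = 0;

    while (malloc(chunk) != NULL) {  /* keep every chunk, never free it */
        total += chunk;
    }

    printf("Gave up after ~%lu MB of allocations\n",
           (unsigned long)(total / (1024 * 1024)));
    return 0;
}
```

(The same loop built as a 64-bit binary will keep going far past 2GB, until it hits RAM plus pagefile, which is the whole point.)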

I wouldn’t say 64-bit Windows became mainstream until, shit, recently. The last five years? Once “most” machines started shipping with 4GB of RAM and Windows 7 as standard.

As of June 2010, 46 percent of Windows 7 installs were 64-bit [per Steam stats].

edit: I was curious so I looked up 2016 hardware survey stats from Steam:

Windows 10 64-bit – 48%
Windows 7 64-bit – 29%
Windows 8.1 64-bit – 9%
Windows 8 64-bit – 1%

Not bad, about 87% of gamers are 64-bit today!

In this case, Qualcomm hardware is lagging behind. Far, far behind. The Android software is mostly fine.

I spent a lot of years working on microprocessors, with all the delightful joy of memory-mapped hardware. I know that you have declared yourself an expert because you posted a Wikipedia article above, but this place is full of serious programmers and your condescending attitude is tiring. Out of curiosity, how many years have you spent in a programming position where you interacted daily with memory-mapped I/O, memory-mapped registers, etc.?

I wasn’t arguing any of these things. You are still not reading what I write. I wrote 4GB as basic shorthand because 3.4-something is an annoyance to type. You tried to counter with how 4 is ridiculous and 2 was the right answer, while citing yourself claiming 3 was best.

I am still not talking about mainstream systems or sweet spots in computer construction. My point is that at the time, 3.4 or 2 or 4 or whatever the hell number you want was the right number, because other things held back taking advantage of having more. Now that barrier is gone, and I think the general consensus is that 4GB is not enough and 8GB is probably the sweet spot for most users.

It is the same reason an older, less powerful CPU is more than enough for gaming now: GPUs hold it back. Eventually the GPU bottleneck and the graphics API bottlenecks will get out of the way, and a current high-end CPU will be a better gaming choice than a current mid-level option. There’s a difference between current sweet spots and technological reasons for having ‘enough’.

You keep making up arguments for me so that you can post some random stats to counter them. I said 2005 as an approximate start of the consumer 64-bit processor era; I didn’t claim it was some saturation point. If you actually read my post, you would see I said 64-bit CPUs with a 32-bit OS were becoming a common config, so I have no idea how you have conflated that with me claiming 64-bit OSes were widespread, or how 2016 install bases are a counterpoint to my comments.

I have not said that a single 32-bit application in 32-bit Windows has more than 2GB of address space. I have not said that everyone in 2005 used 64-bit Windows (I noted that 64-bit Intel CPUs were starting to roll out). And I would venture that I have far more experience with many of the things you keep bringing up out of the blue as attacks, claiming that I am ignorant of them.

Qualcomm phone hardware is clearly pretty crappy; I have had the fantastic joy of working with Qualcomm devices in the embedded space. The fact that their chips have more cores is not why they are crappy.

On this we can agree: Qualcomm CPUs are currently awful. Adding 8 of them on a chip does not make them any less crappy, nor does it magically unlock 8 CORE BLAST PROCESSING™.

It’s the same reason a low-end dual-core Skylake i3 will outrun the Xbone or PS4 handily. Two fast cores > eight slow cores. All day, every day… unless you’re encoding video or hitting some other niche edge case that is extremely amenable to parallelization.
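To put some toy numbers behind that, here’s a quick Amdahl’s law sketch in C; the 2.0x and 1.0x per-core speeds and the parallel fractions are made-up illustrative values, not measurements of any real chip:

```c
/* Back-of-the-envelope Amdahl's law comparison of "two fast cores" vs
 * "eight slow cores". Per-core speeds (2.0x vs 1.0x) and the parallel
 * fractions are hypothetical numbers for illustration only. */
#include <stdio.h>

/* Throughput relative to one 1.0x core running the whole job serially. */
static double throughput(double core_speed, int cores, double parallel_fraction) {
    double time = (1.0 - parallel_fraction) + parallel_fraction / cores;
    return core_speed / time;
}

int main(void) {
    const double fractions[] = { 0.25, 0.50, 0.75, 0.95 };
    printf("parallel %%   2 cores @ 2.0x   8 cores @ 1.0x\n");
    for (int i = 0; i < 4; i++) {
        double p = fractions[i];
        printf("   %3.0f%%          %5.2fx           %5.2fx\n",
               p * 100.0, throughput(2.0, 2, p), throughput(1.0, 8, p));
    }
    return 0;
}
```

With those assumptions, the eight slow cores only pull ahead once a workload is something like 95% parallel, which is video-encoding territory, not game code.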

Where can I get 8 CORE BLAST PROCESSING™!?!??!?? I need it now

I never noticed this but it is quite true. My iPad Air 2 is much faster than my HTC smartphone for all the apps that run on both platforms. At least, the apps I use.

I don’t care that much, but I wish the HTC actually worked as, you know, a phone. Can’t hear anything on it and unless I’m in the car the other party usually has trouble understanding what I’m saying as well.

The new Pixel XL seems to be decent as a phone, even though it’s made by HTC… I’ve only had a couple of calls on it, but it seemed loud and clear.

Update: full review has now hit!

TL;DR:

In the end, the Pixel XL is a decent enough phone, but it is not the ultimate Android phone that people were likely hoping for. It fails to stand out in a crowded market and cannot claim to be the best in any single category; at best it is a jack of all trades. This is a serious problem for a phone that is positioned as and priced like a flagship phone. It also does not help that it’s missing support for microSD cards and wireless charging (it does support the USB Power Delivery specification for 18W fast charging), features that are available on the Galaxy S7 edge. There’s also no environmental protection against water and dust, which both the S7 edge and iPhone 7 Plus include. Even its exclusive software feature, Google Assistant, should be available on future Android phones. In the end, the Pixel XL is a Nexus phone with another name. It still delivers a pure Android experience and timely software and security updates, but is that enough to justify its flagship price?

Sounds like my assumptions were pretty spot on. Happy to remain a 5X owner.

Yeah that’s a depressing summary. :(

OK, so I’ll never buy Apple; I was always a Commodore guy… It turns out Sprint still has their upgrade plans, so I guess I’ll hold out for the S8, Feb/March 2017? I’m still using a rooted S2, which I’d keep, but I’m having trouble updating it to CyanogenMod, and it’s running ICS, which I take it is vulnerable…

Anything earlier than KitKat 4.4 (released Oct 2013) is utterly discontinued and probably super vulnerable to exploits. KitKat is security updates only.

Here’s a fun benchmark for you guys

http://browserbench.org/Speedometer/

Go ahead and post your results. Show me yours and I’ll show you mine 😉

As covered here
The truth about traditional JavaScript benchmarks · Benedikt Meurer

And here

Specifically (remember, this is coming from a dude who is literally responsible for the V8 JS engine inside Chrome)

We have been adding ways to Chrome infrastructure to measure performance from inside the browser, which gave a lot of important insight, but is not portable across browsers. From these investigations I am concluding that Speedometer might be the best proxy for real world performance that is currently available. We will follow up with a detailed blog post about the real world performance effort soon.

cough as I was saying

But it’s a browser benchmark measuring web applications.

I’ll get you started.

Core i7 Skylake Chrome – 185
Core i7 Skylake Edge/FF – 64
2016 iPhone 7 – 111
2016 Macbook (Safari) – 63.3
2016 Macbook (Chrome) – 88.1
2013 iPhone 5s – 32
2014 iPad Air 2 – 48

I always thought Apple devices didn’t support JavaScript? That you had to use HTML5 instead? I don’t own one, so I didn’t know. Apparently they do, and that’s the most important metric? Because most apps use JavaScript or something like that?

You’re thinking of Flash.

But don’t let that distract you from Wumpus yet again refuting an argument that no one is making.