
Thread: Ouya Game Console Performance Is Disappointing

  1. #51
    Join Date
    Feb 2013
    Posts
    219

    Default

    Quote Originally Posted by bnolsen View Post
    If you were right then java should not have been rejected outright on the linux desktop. It most certainly has and for VERY GOOD reason.
    It was rejected? Any link or details? I use Java every day on my Linux system. Many Java devs prefer Linux.

    Quote Originally Posted by bnolsen View Post
    I've also worked enough with java to know that a few hand picked benchmarks does NOT necessarily indicate reality.
    I don't see any reason to suspect that web site of "hand picking" benchmarks or having some Java bias/agenda.

    Quote Originally Posted by bnolsen View Post
    And there's the elephant in the room: Java applications hog memory like there's no tomorrow.
    I believe there is a big difference between the desktop HotSpot JVM and Android's Dalvik runtime on this. The HotSpot JVM is designed for servers where there is often lots of RAM.

    I suspect that there may be large differences in fixed overhead, but if you load the same volume of data into a common data structure in Java vs another language, I don't see any reason why the Java language or runtime would impose a significantly higher memory cost.

    Quote Originally Posted by bnolsen View Post
    And I'll toss in multithreaded scaling for extra measure (my current specialty).
    Care to elaborate? Java does very well with multithreaded code and asynchronous code. Scala is much more elegant for async work but has similar performance characteristics AFAIK.
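To illustrate the kind of multithreaded code Java handles well, here is a minimal sketch (not from the thread) that splits a sum across a thread pool with the standard `java.util.concurrent` classes. The class name and chunking scheme are just illustrative.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelSum {
    // Sum an array by splitting it into per-thread chunks.
    public static long parallelSum(long[] data, int threads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        int chunk = (data.length + threads - 1) / threads;
        List<Future<Long>> futures = new ArrayList<>();
        for (int t = 0; t < threads; t++) {
            final int start = t * chunk;
            final int end = Math.min(data.length, start + chunk);
            futures.add(pool.submit(() -> {
                long s = 0;
                for (int i = start; i < end; i++) s += data[i];
                return s;
            }));
        }
        long total = 0;
        for (Future<Long> f : futures) total += f.get(); // join the partial sums
        pool.shutdown();
        return total;
    }
}
```

For embarrassingly parallel work like this, the JVM's threads map straight onto OS threads, so scaling is limited by the hardware, not the language.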

    Quote Originally Posted by peppercats View Post
    JVM and Dalvik are completely different beasts.
    Yes, of course. The runtime VM is probably more directly related to performance than the programming language is. However, two people were citing "Java", not Dalvik, as being the primary problem with Ouya performance, so I felt it appropriate to cite more readily available desktop Java performance info.

  2. #52
    Join Date
    Sep 2012
    Posts
    568

    Default programming languages speed

    Please, stop comparing nonsensical stuff.

    - The speed of C/C++ programs is heavily tied to the quality of the compiler (the thing that transforms source code into machine instructions). For a big program, you can see a 10x (perhaps even 100x) performance difference between a 15-year-old compiler and a modern one (just look at the Phoronix LLVM vs. GCC comparison tests to see how much it improves between versions).
    - The speed of Java programs is just as tied to the quality of the virtual machine (the thing that transforms bytecode into machine instructions). For a big program, you can see a 10x (perhaps even 100x) performance difference between a 15-year-old VM and a modern one.

    Modern Java VMs have been very efficient (on some platforms) for some years now. Dalvik has not been for most of Android's life, though it is getting better.
    The performance of a language (assembly aside) is not set in stone, and not even always relevant.

  3. #53
    Join Date
    Feb 2008
    Location
    Linuxland
    Posts
    4,728

    Default

    @DanLamb

    You posted throughput benchmarks, when the topic was latency.

    To use a game analogy, it does not matter if the average fps is 200 when it dips to 10 once per second.

  4. #54
    Join Date
    Feb 2013
    Posts
    219

    Default

    Quote Originally Posted by curaga View Post
    You posted throughput benchmarks, when the topic was latency.
    The official thread topic is actually disappointing Ouya sales, but it's branched out in many directions. I know the difference between throughput and latency, and wrote a ton of stuff on latency. I believe several people were arguing that Java was terrible in general and not just for variable latency reasons.

    @erendorn, I generally agree with what you said.

    I'd also note that many, or even most, Android and Ouya games have their core game-loop code in C/C++/RenderScript.

  5. #55
    Join Date
    Feb 2013
    Posts
    252

    Default

    Quote Originally Posted by curaga View Post
    @DanLamb

    You posted throughput benchmarks, when the topic was latency.

    To use a game analogy, it does not matter if the average fps is 200 when it dips to 10 once per second.
    this is the worst part of using a GC language for making games, you get a lot of FPS spikes from garbage collection.

  6. #56
    Join Date
    Feb 2013
    Posts
    219

    Default

    Quote Originally Posted by peppercats View Post
    this [latency] is the worst part of using a GC language for making games, you get a lot of FPS spikes from garbage collection.
    Do you have any evidence of this at all?

    Have you played the JDK version of Quake 2? http://bytonic.de/html/jake2.html

    I have, and I don't see FPS spikes. Also, people who have benchmarked this find that the Java version actually achieves slightly higher FPS than the official C version. I find it hard to believe that the Java version has significant FPS spikes and still manages a higher average FPS.

    I've played other GC language games like Wakfu (Java) and Bastion (C#) and I didn't subjectively notice FPS problems.

  7. #57
    Join Date
    May 2011
    Posts
    1,295

    Default

    Quote Originally Posted by peppercats View Post
    this is the worst part of using a GC language for making games, you get a lot of FPS spikes from garbage collection.
    How much garbage collection would there be during a game? I'm assuming objects won't be created and thrown away on a regular basis.
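Indeed, the usual way to keep GC out of a game loop is to not allocate during it at all. Here is a minimal sketch (illustrative class names, not from any real engine) of the object-pool pattern: game objects are recycled instead of being created and thrown away every frame, so the collector has almost nothing to do.

```java
import java.util.ArrayDeque;

public class BulletPool {
    public static final class Bullet {
        public float x, y, vx, vy;
        public boolean live;
    }

    private final ArrayDeque<Bullet> free = new ArrayDeque<>();

    // Reuse a pooled object if one exists; allocate only on a pool miss.
    public Bullet obtain() {
        Bullet b = free.poll();
        return (b != null) ? b : new Bullet();
    }

    // Return the object to the pool instead of letting it become garbage.
    public void release(Bullet b) {
        b.live = false;
        free.push(b);
    }
}
```

After a brief warm-up period the pool reaches its high-water mark and the steady-state loop allocates nothing, which is exactly the "no objects created and thrown away on a regular basis" situation described above.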

  8. #58
    Join Date
    Feb 2013
    Posts
    252

    Default

    Quote Originally Posted by DanLamb View Post
    Have you played the JDK version of Quake 2? http://bytonic.de/html/jake2.html
    I'm sorry, but a port of a 16 year old game is not a good representation of how strong something is, I guarantee you that the game is going to be entirely graphics-card limited.

  9. #59

    Default

    Quote Originally Posted by DanLamb View Post
    Android is generally very competitive with iOS on various OpenGL/CPU/GPU benchmarks even with the Dalvik VM system.

    Lulz. Obviously not real-world test scenarios then. But keep clinging to those benchmarks all you want - they were specifically intended for the non-developer type to gush over.

    Apple's hardware often scores higher, but that is more of a testament to Apple's engineering excellence and better hardware. Benchmarks aren't perfect, but I trust them way more than I trust you.
    Honestly son, I don't care who you trust - I am the guy in the trenches who has to make this stuff work. And Android has no hope whatsoever (EVER) of ever matching the same performance in RetroArch that I get on other platforms like -

    * Wii
    * PlayStation3
    * Gamecube
    * Xbox 360
    * Xbox 1
    * iOS
    * Blackberry
    * PC

    I would like to put you on notice and remind you that you are talking to a guy who has ported to all those platforms and more - you are not dealing with some purported 'developer' who 'just' made some kind of low-key game, got it working on Android at 6fps, and then patted himself on the back thinking 'OK, this is going to run just swell in two iterative generations or so.' So honestly, I think I know a thing or two about how these platforms stack up - far better, in fact, than your 'pie in the sky' benchmarks would ever indicate to you.

    Honestly, your 'pie in the sky' benchmarks take you down one optimal render path and that is it. They don't tell you jack about CPU performance, driver issues, shitty blitting (another Android trait), shitty garbage collector stalls, and on and on.

    Audio latency does seem to have been a legitimate issue. Android 4.2 claims to address this depending on hardware support from manufacturers. From the official Android dev site:
    http://developer.android.com/about/v...elly-bean.html



    Also, docs on OpenSL ES for pre Android 4.2:
    http://mobilepearls.com/labs/native-...les/index.html

    It doesn't sound like Android's audio latency problems had anything to do with Dalvik, virtual machines, or Java.
    Thanks Captain Obvious for pointing out stuff to me that I already know. However, it seems you don't even read the stuff that you are selectively quoting and Googling - notice the caveat there that says 'THIS RELIES ON HARDWARE SUPPORT'.

    And guess how 'swell' that hardware support is up to now? Exactly one Google Nexus device and as for the rest - shit all - that is what.

    And guess what? Even the '40ms' figure they are now holding up as an 'ideal target' is pretty crappy compared to the kind of audio latency you can get with CoreAudio on iOS. So even on that one special phone where the 'fast SL mixer path' goes into effect - it is still shit compared to iOS.

    Android is taking performance and latency very seriously.
    Like hell. Dianne Hackborn quite honestly doesn't give a shit about performance, or anything that gets in the way of her precious 'framework' and the way it ought to be 'properly used' (note - some ideological claptrap and 'good design' principles according to a couple of demented Java ideologues).

    Look at the Project Butter 60 FPS locked interface improvements in 4.1. You can also see tons of more detailed performance analysis and improvements on the Android dev site.
    At this point you pretty much sound like an end user. 'Project Butter 60fps locked interface' - here is the deal, son - most of the time your precious Android devices don't even have screens capable of a 60Hz refresh rate - as in the case of the Galaxy SIII, the Galaxy Note 2, the JXD S7300 - and we can go on and on. Now, surely, those first two devices represent a pretty big slice of the overall market, do they not?

    Also - the fact that those devices 'lie' about their refresh rates means that you cannot even rely on the SDK-supplied 'GetRefreshRate' function - in 75% of all cases it reports a bogus number (because '60Hz' looks better on some spec sheet than '58Hz' or '50Hz', now doesn't it?).

    Now it gets even worse when you want to heuristically detect the refresh rate and time your game/emulator correctly according to it - good luck doing that with a garbage-collecting OS.
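The heuristic being described might look like the following sketch (purely illustrative, not RetroArch's actual code): record one timestamp per presented frame, then derive the rate from the median interval, since the median resists the occasional GC-stretched frame that would skew a plain average.

```java
import java.util.Arrays;

public class RefreshRateEstimator {
    // timestampsNanos: one System.nanoTime() reading per vsync'd frame.
    public static double estimateHz(long[] timestampsNanos) {
        int n = timestampsNanos.length - 1;
        long[] deltas = new long[n];
        for (int i = 0; i < n; i++)
            deltas[i] = timestampsNanos[i + 1] - timestampsNanos[i];
        Arrays.sort(deltas);
        long median = deltas[n / 2]; // robust against outlier (stalled) frames
        return 1e9 / median;
    }
}
```

The catch the post is pointing at: on a garbage-collecting OS, even the measurement run itself can be interrupted by a collection pause, so you need enough samples (and a robust statistic) for the estimate to converge.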

    I would also point out that even console games use some higher level programming language like Lua in some sandbox type runtime to process game logic and it doesn't seem to have disastrous performance consequences.

    Ouya as a whole... The Tegra 3 is just lower performance hardware than what you get with a PS3/360 console.
    Thanks for that Captain Obvious.

    However, you might just want to get yourself clued up a bit more - it isn't just that it can't compete with a PS3/360 -

    IT CAN'T COMPETE WITH AN XBOX 1/WII.

    How do I know? I develop for the aforementioned boxes, and I can also test the same software across all these devices.

    And guess what?

    Yoshi's Island running with SNES9x Next runs at full speed on a Nintendo Wii.

    A 1.5/1.6GHz dual-core Cortex-A9 (ARMv7) Android phone, however? 55-56fps and lovely audio crackling - and the GC stalls add some more lag on top of that.

    Now you go and rationalize that all you want while an iPad 2 plays that same game just fine. Seriously, stop drinking the Java/Google koolaid and admit that the entire platform sucks. I have the facts; I have the ultimate benchmark (RetroArch) that proves this platform sucks, because I can run the very same software with the very same frontend codebase on all these platforms - if a platform doesn't deliver on relatively tight syncing, it shows in its runtime performance. And guess where Android stands in all of that? Right at the bottom of the food chain - a Cortex A8 Android tablet performs even worse than a Nintendo Gamecube, thank you very much.

    Now please - next time you try to 'correct' somebody, make sure you aren't talking to somebody who has already been there, done that, and got the T-shirt.

  10. #60

    Default

    Quote Originally Posted by BO$$ View Post
    A lot of games are graphics-card limited so there is enough room for GC without a performance hit. Also Android uses concurrent GC so there are only two small pauses not a big hitch. Since most cpus are multicore now it can be assumed the GC will be run on a core with less than 100% use so it won't hit the rendering thread or whichever thread uses most of a core.
    You are making up excuses for the inexcusable - garbage collector stalls are INEXCUSABLE for any kind of realtime app.

    Don't come at me with this 'it will be spread across a second core and therefore you won't notice the stalls' - because I have experienced the exact opposite of that in tons of runtime apps.

    You are making up bad excuses for a terrible operating system (Android) and a multibillion-dollar company (Google) that doesn't know what the hell it is doing, nor does it care - as long as it can rake in lots of advertising revenue and 'Facebook' your life in its own little way, it doesn't really matter to Google that Android is a terrible operating system with terrible runtime performance (which is exactly what it is today).
