Hey guys, I've been test-driving a 64-bit release of Firefox for a while now, and I wanted to have a discussion about the stability of the 64-bit release vs. the 32-bit one.

For the most part, things have been smooth... I don't use that many extensions, but those I did in the past work flawlessly. Of course, Windows allocates more memory (probably around 1.5-2x as much) compared to the 32-bit version, but that's to be expected and I can't complain, as I don't have a shortage of RAM.

However, I do have complaints about the stability of the 64-bit version. I've been using Firefox for... sheesh, at least ten years, and the only time I've had issues like this was back when version 3.x was released along with the horrible plugin manager that ate up memory (due to a leak) like a beast. I signed up for the 4.0 beta as soon as I could, and things have been good since then. Well, until now.

Now, I might have some unreasonable expectations, so here's a glimpse of my web browsing:

- Lately, about 30 tabs open constantly
- Close to 100 tabs constantly when I'm working on a project
- A Firefox session usually lasts as long as my computer uptime, which can be between 7 and 14 days
- Average YouTube tabs: 5; Netflix tabs: 1; Pandora: 1
- I use Adblock Plus, with only a few sites whitelisted (like our forums)

With this usage in the past, pages would still load snappily, videos would play back with no problem, and I could expect around 1.6-2 GB of RAM usage at the end of the week. Right now, with 17 tabs open, Firefox is using 1.2 GB of RAM, which isn't a problem... but it's performing like it's hit a RAM ceiling: videos buffering for a while, pages not loading properly (usually a refresh will fix it), and sluggish general performance.

Anyyyywayyys... TL;DR: IMO, the 64-bit version of Firefox performs worse than the 32-bit version despite Windows allocating more resources. I'm curious whether any Firefox users here have noticed a difference in performance between the 32-bit and 64-bit releases.
I know, I should probably just restart the browser more often... but it's sad, IMO, going from 14 days of stable uptime to restarting Firefox every day. Plus, more often than not I have information in forms that are partially filled out, so I can't close the browser without losing that work. I'll probably be switching back to the default mainstream 32-bit release, but I am curious what others have to say before posting a Bugzilla report.
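As a side note on the 1.5-2x memory difference mentioned above: a 64-bit build uses 8-byte pointers instead of 4-byte ones, so pointer-heavy programs like browsers naturally grow. If you ever need to check which build a process actually is, here is a minimal, illustrative Python sketch (not Firefox-specific; any program can report its own bitness the same way):

```python
import struct

# The size of a C pointer ("P") tells you the bitness of the running build:
# 4 bytes -> 32-bit process, 8 bytes -> 64-bit process.
pointer_bits = struct.calcsize("P") * 8
print(f"Running a {pointer_bits}-bit build")
```

The same pointer-width doubling is a rough intuition for why the 64-bit version allocates more memory for the same set of tabs.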
DICE to require 64-bit OS for some 2013 games; that Windows ME box in the den isn't cutting it

We're entering a world of mainstream 64-bit computing -- whether we like it or not. Just weeks after Adobe started requiring 64-bit Macs for CS6, DICE's Rendering Architect Johan Andersson has warned that some of his company's 2013 games using the Frostbite engine will need the extra bits as a matter of course. In other words, it won't matter if you have a quad-core i7 gaming PC of death should the software be inadequate; if you're still running a 32-bit copy of Windows 7 come the new year, you won't be playing.

The developer points to memory as the main culprit, as going 64-bit guarantees full access to 4GB or more of RAM as well as better virtual addressing. Andersson sees it as a prime opportunity to upgrade to Windows 8, although 64-bit Vista and 7 (and presumably OS X, if and when Mac versions exist) will be dandy. Just be prepared to upgrade that Windows XP PC a lot sooner than Microsoft's 2014 support cutoff if you're planning to run the next Battlefield or Mirror's Edge.

source
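The memory argument in the article boils down to simple arithmetic: a pointer's width in bits caps how many distinct byte addresses a process can form. A quick Python illustration (the function name is mine; in practice a 32-bit Windows process gets even less than 4 GiB, since the OS reserves part of the address range):

```python
# Maximum addressable memory for a given pointer width, in GiB.
# Real-world limits are lower: the OS reserves part of the address space.
def address_space_gib(pointer_bits: int) -> float:
    return 2 ** pointer_bits / 2 ** 30

print(address_space_gib(32))  # 4.0 -> the 4 GiB ceiling of a 32-bit process
print(address_space_gib(64))  # 17179869184.0 GiB -> effectively unlimited today
```

This is why a 64-bit build "guarantees full access to 4GB or more of RAM": the addressable range stops being the bottleneck.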