Search the Community
Showing results for tags 'Nvidia'.
-
Hello everyone, bad news: my RTX 2060 Super, which served me for about four years, told me "farewell, my love" and started showing artifacts on screen. The video chip has come loose from the board; to say I was upset is to say nothing. (NOT MY IMAGE) I got roughly this kind of screen when I started the computer this morning, and now the video card doesn't output an image to the monitor at all. I've already pulled it out. I thought maybe some repair shop could help by resoldering the chip or something like that, but they told me I might as well take it to the scrap heap :'/ or sell it for parts to someone in need.

Because of a limited budget, university fees to pay, and the fact that I need the computer constantly, I'll have to buy something much cheaper, and very urgently, because I can't wait. In short, the budget is modest, so I'm choosing between a GTX 1650 (GDDR6, 12000 MHz effective, 4 GB) and an AMD RX 6400 (GDDR6, 16000 MHz effective, 4 GB). On one hand the RX looks better, but it has a much narrower memory bus and therefore lower memory bandwidth, only 64-bit versus 128-bit on the 1650, while the RX has a higher GPU frequency and memory frequency.

The question is pressing: the price is about the same, but I have a lot of doubts. Has anyone used either of these cards, and what were your impressions? Which is the better pick here? I only have a couple of days to think about it; I can't wait any longer, but I can't choose either. Few people can explain it objectively, and choosing the GTX just because it's NVIDIA is not an argument. Before the RTX 2060 I had an RX 550, which saw a lot of use and was retired. I'm not particularly picky, but for this money I'd like to get roughly the maximum I can, at least for the year ahead. It's clear I'll buy something newer later, but that will be in a year, maybe a year and a half, I can't say yet.
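For a rough comparison, the bus widths and effective data rates quoted above translate into peak memory bandwidth like this. A minimal sketch using only the figures from the post (12 Gbps and 16 Gbps GDDR6); it deliberately ignores everything else that affects real-world performance, such as the RX 6400's on-die cache and its narrower PCIe link:

```cpp
#include <cstdio>

// Peak memory bandwidth in GB/s = bus width (bits) x effective data rate (Gbps) / 8.
static double bandwidth_gbs(int bus_bits, double data_rate_gbps) {
    return bus_bits * data_rate_gbps / 8.0;
}

int main() {
    // Figures quoted in the post (both cards use GDDR6).
    printf("GTX 1650 (128-bit @ 12 Gbps): %.0f GB/s\n", bandwidth_gbs(128, 12.0));
    printf("RX 6400  ( 64-bit @ 16 Gbps): %.0f GB/s\n", bandwidth_gbs(64, 16.0));
    return 0;
}
```

By that simple measure the 1650's wider bus outweighs the RX 6400's faster memory, which is exactly the trade-off the post is weighing.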
-
As you all know, I've been an avid Radeon supporter for many years. I've bought nothing but Radeon GPUs over the course of the past nine years, and today that streak came to an end. It's not that their hardware isn't competitive, nor is it about waiting for a decent architecture. I can live without HairWorks, ray tracing, and CUDA acceleration. Only 71 fps in ET? No problem. Microstuttering? I can live with it. Their drivers, though. Oh man. What a steaming heap of ----. And that's why I'm parting ways with Radeon GPUs. And blue screens only paint a tiny fraction of the picture. For six months, I thought my GPU had failed. It had all the telltale signs of failure: black screens, artifacts, fans ramping up to 100% with a system hang, you name it. I used a backup card for many months until I absolutely needed my 480 for some work. I only needed it to work for 15 minutes. It's now been running for over a month without a problem. That's right, their drivers were so bad they mimicked hardware failure (expensive hardware failure, at that). And yes, I threw the book at it (DDU, rolling back drivers, flashing the vBIOS, etc.). Now, I've been in the market to upgrade for some time. But I didn't really see any competition, and I wanted to continue to support AMD. So I waited. I waited through the disaster that was Vega, and I waited for Navi. And seven months down the line, only half of the issues with Navi have been fixed. Now people are saying to wait for Navi 2. But this time, I'm finally fed up. I need something that just works. So I'm going back to Nvidia. Big whoop, right? I make a long post just to say that? Well, no. Because I also can't recommend Radeon products any more; they've lost my trust. And if you're building a new PC, or considering an upgrade with a Radeon GPU, I just have two words: good luck.
-
AdoredTV has put out another educational video: Yes, we all know about AMD's rebranding of Tahiti GPUs from the HD 7xxx series to the R9 3xx series, and other companies have done similar things, but this is still an interesting look at Nvidia's history and how they have put a brake on technological innovation (and are still doing so, e.g. asking developers to disable async compute altogether). Especially interesting in light of this: https://www.tomshardware.com/news/intel-discrete-gpu-2020-raja,37289.html
-
The new Nvidia Geforce Experience is f***ing retarded. I don't want a social networking platform on my Driver/Game optimizing program. And the shittiest part is you need to log in with one of your already existing social media accounts, or create an account through them. Desperate attempt by Nvidia to access your information, and advertise their name. Suck a **** Nvidia. http://www.tomshardware.com/news/geforce-experience-beta-ui-update,32196.html
-
http://www.pcgamesn.com/crysis/nvidia-class-action-lawsuit-settlement
-
NVIDIA has just announced its new graphics card based on the Pascal architecture, the NVIDIA GTX 1080. However, NVIDIA did not reveal how much faster this GPU is than its current high-end GPU, the GTX 980 Ti. Thankfully, NVIDIA has released some official benchmarks comparing the GTX 980 and the GTX 1080, so it's time to see the performance benefits PC gamers can expect from this brand new GPU. NVIDIA claimed that the GTX 1080 is faster than two GTX 980s in SLI. However, according to the official charts, that's not true. As we can see below, the GTX 1080 is 1.7X and 1.8X faster than the GTX 980 in The Witcher 3 and Rise of the Tomb Raider, respectively. According to NVIDIA, The Witcher 3 was tested at 1440p on Ultra settings.

So let's take a look at PC Gamer's benchmarks that compare the GTX 980 with the GTX 980 Ti. The performance gap between the GTX 980 and the GTX 980 Ti is almost 20%. This basically means that, if NVIDIA's figures are anything to go by, the GTX 1080 is up to 50% faster than the GTX 980 Ti in The Witcher 3. What's also interesting here is that we are most likely looking at a 1.75X performance increase over the GTX 980. NVIDIA claimed 2X, but that won't be the case in real tests/benchmarks (as shown by both Rise of the Tomb Raider and The Witcher 3). The performance difference between the GTX 980 Ti and the GTX 980 is, according to a lot of benchmarks, around 20-30%. Therefore, going by NVIDIA's figures, the GTX 1080 is up to 50% faster than the GTX 980 Ti in extreme situations (we expect an average performance boost of around 40%).

So there you have it, everyone. The big question now is whether AMD will be able to compete with NVIDIA. Another big question is whether the GTX 1080 will perform better in DX12 than all previous NVIDIA cards. Like it or not, the green team faces major performance issues in DX12, something that will undoubtedly help AMD in the long run. The fact that NVIDIA did not say anything at all about async compute or DX12 is also something that worries us. Here is an extreme example: Quantum Break runs 30% faster on AMD's hardware. Assuming the GTX 1080 is 40% faster than the GTX 980 Ti, AMD would basically have to offer a GPU with 10% better performance than the Fury X in order to retain the performance crown in that particular game. Source http://www.dsogaming.com/news/report-nvidia-gtx1080-is-up-to-50-faster-than-the-gtx980ti/
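For what it's worth, the "up to 50%" conclusion is just ratio arithmetic on the figures quoted above; here is a minimal sketch of that reasoning (the numbers are the article's rough estimates, not measurements):

```cpp
#include <cstdio>

int main() {
    // Figures quoted in the article, treated as rough estimates.
    double gtx1080_vs_980  = 1.7;  // NVIDIA's Witcher 3 chart (1.8x in Rise of the Tomb Raider)
    double gtx980ti_vs_980 = 1.2;  // ~20% gap between the 980 Ti and the 980 in third-party tests

    double gtx1080_vs_980ti = gtx1080_vs_980 / gtx980ti_vs_980;
    printf("Implied GTX 1080 vs GTX 980 Ti: %.2fx (~%.0f%% faster)\n",
           gtx1080_vs_980ti, (gtx1080_vs_980ti - 1.0) * 100.0);
    // The same math with the 1.8x figure gives 1.5x, i.e. the article's "up to 50%";
    // with 1.7x it lands in the low 40s, close to the article's expected average.
    return 0;
}
```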
-
The world's smallest gaming cylinder, the Vortex measures only 10.5 inches tall but can blast away bigger systems with an unlocked Intel Core i7 6700K processor, up to 64 GB of DDR4 memory, SSDs in Super RAID 4, and dual NVIDIA GeForce GTX 980 GPUs in SLI. MSI's newest tower is also engineered for superior upgradability, expandability, and compatibility with a variety of connectors, including Thunderbolt 3, which supports data transfer rates of up to 40 Gbps. Minimal in footprint but maximum in performance, the Vortex features MSI's 360° Silent Storm Cooling technology to keep it cool under the most intense gaming sessions. MSI's cooling system creates a vortex-shaped wind path by capturing heat from the bottom of the unit and circulating it upwards, resulting in superior thermals and acoustics. A 450W 80 Plus Gold power supply is also built into the Vortex to save space while fully powering the system and any peripherals you want to connect. "The revolutionary Vortex gaming tower shatters the misconception that bigger systems deliver more power and performance," says Andy Tung, president of MSI Pan America. "The groundbreaking system maximizes space and heat dissipation to unlock the power of its components for truly astronomical performance in a subwoofer-sized package." MSI's Vortex can simultaneously support up to six video outputs for intense multi-tasking and multi-screen experiences, is 4K ready, delivers lag-free gaming with Killer DoubleShot X3 connectivity, and provides an immersive multimedia experience with Nahimic Military X Gaming, the first audio technology adopted by the French Army to vividly reproduce audio in their aircraft training simulators. Perfect as a gaming desktop in the bedroom or as an entertainment station in the living room, MSI's Vortex can be customized and controlled using the Dragon Center Dashboard and App. Through the software, users can customize lighting preferences, monitor system performance, launch utilities and apps, fine-tune the system with personalized profiles, and more. source: guru3d.com The Vortex is now available in the USA, starting at $2,199.99 before tax. It should also be available in Europe for around 2,300 euros (including taxes), but not with the two GTX 980s and 64 GB of RAM; that configuration comes with two GTX 960s and 16 GB. Otherwise, the high-end configuration described above will run around 3,600 US dollars, which would be roughly 4,000 euros including taxes.
-
DirectX 12 is a really hot topic as we've got a number of games supporting it, and Oxide developer Kollock has revealed some interesting new information about this new API. According to Kollock, the latest WHQL drivers from NVIDIA support async compute, a feature that NVIDIA's GPUs did not initially support. As Kollock claimed: "I can confirm that the latest shipping DX12 drivers from NV do support async compute. You'd have to ask NV how specifically it is implemented." Async compute is a feature that was previously supported only by AMD's GPUs, so it will be interesting to see what NVIDIA has done via its drivers, and whether the results are as good as AMD's. For what it's worth, async compute is still disabled in Ashes of the Singularity for NVIDIA's GPUs. "Async compute is currently forcibly disabled on public builds of Ashes for NV hardware. Whatever performance changes you are seeing driver to driver doesn't have anything to do with async compute."

Another interesting thing is that Steam's overlay works in DX12 by hooking the present call. DX12 and UWP were criticized for not supporting overlay programs. However, as both Kollock and NTAuthority stated, Steam's overlay can work in DX12. As Kollock said: "I'd also point out more basically that the Steam overlay works by hooking the present call, in DX12." And here is Steam's overlay working, courtesy of NTAuthority. Windows Store games don't support overlays, they say...

Kollock also talked about the rumoured list of issues that come with DX12 and Windows 10 Store games. According to Oxide's developer, the only real issue right now is with the refresh rate, and only when vsync is disabled. In case you are not aware, the rumoured list of DX12 and Windows 10 Store issues is this one:
-60 FPS cap
-60Hz forced
-Gsync and FreeSync not supported
-Windowed mode is forced (store only)
-.EXE encrypted (store only)
-No true access to game files (store only)
-Little to no 21:9 or multi-monitor support (store only)
-3rd party FPS readers and FPS limiters won't work (for most of the titles)
-No .ini modding (store only)
-No ReShade support
-No SweetFX support
-Limited key binding (store only)
-Limited driver intervention due to app encryption (store only)
-Slower roll-out of patches (as they must pass WHQL per patch – store only)
-Poorer support for mGPU configurations
-At the mercy of developers to implement desired features

Regarding this list, Kollock had this to say: "Uhm, no. Almost none of these are true. I have no idea what Windows Store apps require, but Steam is 90% of the game market, and the rest is GMG, GoG, and Origin. The main difference between D3D12 and D3D11 is that when you create a swap effect you have to use FLIP rather than DISCARD. That has implications on refresh, but you can also do the refresh of the monitor. Practically speaking, this only matters if vsync is disabled, which is generally done only for performance testing." Kollock also added: "As I've said before, the only difference between a D3D11 and a D3D12 title at present is that when creating a swap chain in D3D12 you must use the swap effect DXGI_SWAP_EFFECT_FLIP_DISCARD, while in D3D11 you can use DXGI_SWAP_EFFECT_DISCARD. This has implications when vsync is disabled, but shouldn't when vsync is enabled, which is not only what we default to, but what we recommend for players. More or less, this becomes a very subjective argument over whether tearing or dropping frames is better when you don't want to run at the refresh rate of your monitor. However, most people in the industry would concur that you really want to be running at your monitor's refresh rate."

And as for FreeSync/G-Sync, Kollock said that they should work, unless FreeSync/G-Sync support has been hacked into a game in order to work: "The limitation is with vsync off. FreeSync and Gsync work by changing the refresh, so I don't think this change would impact it at all. It should [work], I don't think this change would impact FreeSync and G-Sync except perhaps to break things temporarily if they are hacking them to work. I don't know enough of how they've made FreeSync/G-Sync work at a system level." What is made crystal clear here is that it's not only Microsoft that needs to work on the current DX12 issues, but also AMD and NVIDIA. Both the red and the green team need to make some driver changes in order to support features that were previously working fine in DX11 (but for some strange reason are broken in DX12, especially if there were hacks involved in making these features work in the first place). Source http://www.dsogaming.com/news/directx-12-async-compute-supported-in-latest-nvidia-drivers-steam-overlay-works-in-dx12-mode/
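To illustrate the swap-chain difference Kollock is describing, here is a minimal, hedged C++ sketch of creating a flip-model swap chain through DXGI. This is not Oxide's code, just an example of the API he references; it assumes a Windows 10 SDK, links against dxgi.lib, and omits error handling:

```cpp
#include <windows.h>
#include <dxgi1_4.h>

HRESULT CreateFlipModelSwapChain(IDXGIFactory2* factory,
                                 IUnknown* deviceOrQueue,   // D3D11: the device; D3D12: the command queue
                                 HWND hwnd,
                                 IDXGISwapChain1** outSwapChain)
{
    DXGI_SWAP_CHAIN_DESC1 desc = {};           // Width/Height 0 -> use the window's client size
    desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;                 // flip model does not allow MSAA back buffers
    desc.BufferUsage      = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount      = 2;                 // flip model needs at least two buffers

    // D3D11 may still use the old blt model (DXGI_SWAP_EFFECT_DISCARD);
    // D3D12 requires one of the flip effects, e.g. DXGI_SWAP_EFFECT_FLIP_DISCARD.
    desc.SwapEffect = DXGI_SWAP_EFFECT_FLIP_DISCARD;

    return factory->CreateSwapChainForHwnd(deviceOrQueue, hwnd, &desc,
                                           nullptr, nullptr, outSwapChain);
}
```

In D3D11 the first argument would be the device and the old DISCARD blt model is still accepted; in D3D12 only the flip effects are valid, which is exactly the difference Kollock points out.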
-
Hello everyone, since I visited this forum and have seen some computers from other members, I thought I'd also share my computer with you guys. What it's got:
OS: Windows 10
CPU: Intel Core i5-3570K @ 3.4 GHz (3401 MHz)
RAM: 16 GB
Motherboard: ASRock Z77 Extreme4
Graphics: Nvidia GeForce GTX 660 Ti/OP
Storage: 2x Samsung SSD 850 EVO (120 GB), 2x Western Digital HDD (1 TB)
Mouse: Corsair Gaming M65 RGB
Keyboard: Corsair Vengeance K60
Headphones: Razer Tiamat 7.1
If I missed something you would like to know, please ask.
-
Nvidia has announced the Jetson TX1, a 64-bit ARM board. Yeah, we know: ARM processors are for phones and tablets, not full PCs. But before you switch off, pay some attention to the TX1: it seems like the gap is rapidly closing between ARM and x86, which has been the go-to architecture for computing for decades. According to Phoronix, Nvidia is even saying that the Jetson TX1 matches up to an Intel Core i7 6700K in some aspects of its performance. When the Jetson TK1 came out last year, it appeared that developers were using it as a production board. Now, after Nvidia announced the Tegra X1 (currently used in products such as Nvidia Shield), it is going to release a Jetson based on that chip, which the company claims will have two to three times the performance of the TK1. According to AnandTech, the Jetson TX1 comes as a standalone module with a separate carrier board for I/O. Attached to the board is 4GB of memory, a 16GB flash module, a 2x2 802.11ac plus Bluetooth wireless radio, and a Gigabit Ethernet controller. The complete TX1 module is a bit smaller than a credit card. The separate I/O carrier board contains all sorts of connectors that will be useful for developers. The JTX1 runs on Nvidia's Maxwell architecture and contains 256 CUDA graphics cores offering over 1 TFLOPS of performance. Nvidia is targeting deep learning, computer vision, and GPU computing with this new board. This means that the compute-intensive parts of an application can be passed off to the GPU, rather than bogging down the CPU. Basically, applications will run more efficiently, which is good news for gamers. You won't just be able to slap a Windows install on it right away, but it could pave the way for cheap, all-in-one Linux gaming systems. Or even the return of Windows RT, but actually good this time? Nvidia is currently taking pre-orders for $600, with the Jetson TX1 and the full development kit expected to start shipping on November 16. The standalone TX1 module will go on sale in Q1 2016, and will be available for $300 (in quantities of 1,000). Source http://www.pcgamer.com/nvidias-arm-jetson-tx1-can-actually-compete-with-an-intel-i7-6700k/
-
Nvidia's new driver, version 358.87, is out just in time for the release of Call of Duty: Black Ops III this week. As well as BlOps 3, it also adds "Game Ready" support for Anno 2205 and the new War Thunder patch with GameWorks. Nvidia has also said that it will be adding Game Ready driver support "on or before launch day for the top titles this holiday season." The games included in this are:
-Civilization Online
-Fallout 4
-Just Cause 3
-Monster Hunter Online
-Overwatch
-RollerCoaster Tycoon World
-StarCraft II: Legacy of the Void
-Star Wars: Battlefront
-Tom Clancy's Rainbow Six Siege
-War Thunder
Release 358 has added SLI support for ShadowPlay and GameStream in Windows 10, as well as stereo support for DirectX 12 SLI and 3D Surround. ShadowPlay is Nvidia's screen recording utility, and GameStream is streaming tech that lets you pipe a PC game to a GameStream-capable box connected to your TV over WiFi. The new driver also adds a number of bug fixes related to performance drops and crashes on Windows 10, 8, and 7. Known issues on Windows 10 still include stuttering and flickering in the newly released version of Batman: Arkham Knight. If you're using GeForce Experience and have handed over your email address, then you'll have a chance to win one of the prizes Nvidia is offering this holiday, from games and graphics cards to the Shield Android TV box, part of Nvidia's incentive to join its mailing list. Source http://www.pcgamer.com/nvidia-promises-game-ready-drivers-for-a-dozen-holiday-games/
-
When Nvidia launched the 980M last October, it claimed that the laptop GPU could hit 75-80% of the performance of the desktop card. That was impressive compared to the mobile GPUs of a generation or two back, but not as impressive as what Nvidia's done now: fitting the entire, uncompromised desktop 980 GPU into gaming laptops. Starting in early October with six laptops from MSI, Gigabyte, Asus and Clevo, the GTX 980 is making its way into laptops with all 2048 CUDA cores intact. And it's overclockable. To show off the full-fat 980 doing its thing in laptops, Nvidia demoed several laptops side-by-side with desktop machines to compare benchmarks. In Shadow of Mordor, Tomb Raider, and a couple of other benchmarks, the laptop systems were able to turn in nearly identical scores; at worst, about 5% off what the desktop machine delivered. In those cases it wasn't even quite a fair fight, since a laptop CPU was up against a more powerful desktop CPU. Some of the laptops were actually equipped with desktop parts and delivered dead-even performance. In one case, 3DMark actually turned in identical scores right down to the point. The laptop GTX 980 will run its memory at 7 Gbps, compared to the 980M's 5 Gbps. Whereas the mobile 980s previously only had 3-phase power designs, the new cards will be outfitted with 4-8 phase power designs, which will vary by laptop. Every system that ships with a 980 will have a custom-tuned fan curve to keep the card cool, and it'll also ship with some sort of overclocking tool, like MSI Afterburner or Gigabyte's OC Guru. How overclockable the 980 will be will naturally vary from laptop to laptop, as different systems have different thermal constraints. You can bet that the Asus GX700, that watercooled beast of a laptop we wrote about last week, will be able to push the 980 to its limits. All of this performance requires the laptop to be plugged into AC power, however. Without the extra juice from the wall socket, the 980 will deliver roughly equal performance to the 980M. Here are the ones coming in the near future:
-Asus GX700
-Clevo P775DM-G
-Clevo P870DM-G
-Gigabyte Auros RX7Y4
-MSI GT72
-MSI GT80
Nvidia also said that, as with the 980M, there will be SLI configurations of the full-size 980 in laptops, too. That's likely as much GPU muscle as you're going to find in a laptop until sometime in 2016, when Nvidia has a new generation of cards to roll out. Source http://www.pcgamer.com/
-
Both games included with GeForce GTX 980 or GTX 970. Nvidia has announced that purchases of its GeForce GTX 980 or GTX 970 graphics cards will include a free copy of The Witcher 3: Wild Hunt and Batman: Arkham Knight. In an official statement, the hardware manufacturer said both games benefit from Nvidia GameWorks, its middleware software suite that brings together rendering, visual effects, and physics development modules. "Both Batman: Arkham Knight and The Witcher 3: Wild Hunt raise the bar for graphics, rendering two very different open worlds with a level of detail that we could only dream of last decade," the company said. "On the PC, each game is also bolstered by Nvidia GameWorks--increasing fidelity, realism, and immersion." The Witcher 3 was recently delayed by three months and is now expected to ship on May 19. Meanwhile, the Batman: Arkham Knight release date is set for June 23. Publisher Warner Bros. recently responded to criticism over the pricing of its $40 (£32.99) season pass by detailing some of the content that will be offered to purchasers, such as a prequel campaign featuring Batgirl, character skins, and additional story missions. Source http://www.gamespot.com/
-
Nvidia's given its gaming-focused Shield tablet a connectivity boost in the form of 4G LTE, as well as bumping up the storage capacity. And you'll have it in time for Christmas. The graphics company has put the 32GB 4G version of its 7-inch tablet up for pre-order today at the not-unreasonable price of £299.99. It'll land in your sweaty gaming hands on September 30. Unfortunately, that price doesn't include the Shield Wireless Controller, which will set you back a further £50. Living the stream: The original Shield slate dropped back in July, complete with Nvidia's 192-core Tegra K1 processor and the ability to stream PC games directly from the desktop. Like its predecessor, the 4G model carries the same home-grown processor, backed by 2GB of RAM, and will also let you use the integrated Twitch support to broadcast your virtual exploits online. You'll need to hook yourself up to a data tariff to take advantage of the new connectivity. But since Nvidia's LTE Shield ships unlocked, you've got your choice of carrier. via: http://www.in.techradar.com/news/mobile-computing/tablets/Nvidia-opens-pre-orders-for-its-LTE-Shield-gaming-tablet/articleshow/42711458.cms (sorry about the topic prefix)
-
Nvidia Corp has sued rival chipmakers Qualcomm and Samsung Electronics, accusing both companies of infringing its patents on graphics processing technology. The U.S. chipmaker vies with Qualcomm in the business of providing chips for smartphones and tablets. It said on Thursday that Qualcomm and Samsung had used Nvidia's patented technologies without a license in Samsung's mobile devices, including the just-launched Galaxy Note 4 and Galaxy Note Edge. Nvidia said Samsung devices made with graphics technology from Qualcomm, Britain's ARM Holdings and Imagination Technologies infringed on its patents. "They're using our technology for free in their devices today and they're shipping an enormous number of devices," Nvidia Chief Executive Jen-Hsun Huang said on a conference call. Samsung, the world's biggest smartphone maker, said it would fight Nvidia's lawsuits. "Following a thorough review of the complaint, we will take all measures necessary against NVIDIA's claims," the South Korean firm said in an e-mailed statement, without elaborating on the measures it could take. Nvidia did not say it is suing Imagination or ARM but it did say it is asking the U.S. International Trade Commission to prevent shipments of Samsung devices containing ARM's Mali or Imagination's PowerVR graphics architectures, as well as Qualcomm's graphics technology. Graphics technology from Imagination is also used in Apple Inc's iPhones. Asked whether Nvidia plans to sue Apple since it uses Imagination's technology, Huang declined to comment. "Today we’re focused on Samsung and Qualcomm, and we continue to have productive conversations with a lot of other companies out there,” he said. Nvidia made its name developing leading graphics technology for high-end personal computers but has struggled to expand into smartphones, a market dominated by Qualcomm. Nvidia discussed its patents with Samsung for "a couple" of years before deciding to take legal action, Huang said. Last year, Nvidia announced that it planned to license its graphics technology to other companies. But it has not announced any licensing deals since then. Nvidia said its lawsuits were filed at the U.S. District Court for the District of Delaware and at the U.S. International Trade Commission in Washington. Such dual filings are typical of infringement lawsuits since the district courts can award financial damages and the Commission cannot. At the same time, the Commission can more easily ban infringing products from the U.S. market. ARM declined to comment. Qualcomm did not respond to a request for comment. Source: http://in.reuters.com/article/2014/09/05/nvidia-qualcomm-samsung-elec-idINKBN0H005420140905
-
Looks like promising technology, if it's made accessible on both nVidia and AMD platforms. As it stands, monitors run at a fixed refresh rate while the GPU pushes out more or fewer FPS, so you get either tearing or, with v-sync, stuttering. G-Sync promises to eliminate both by making modified LCD screens refresh in pace with the GPU at variable rates. In more detail at TomsHardware: http://www.tomshardware.com/reviews/g-sync-v-sync-monitor,3699.html Though to me the tearing and stuttering seem to be more of a game engine problem, as I played both Dead Space 3 and Remember Me, which are built up similarly: mostly tight corridors that don't need much GPU power, with views over wider spaces occasionally. DS3 ran at up to 150 FPS and I didn't notice any tearing etc.; well, of course it's hard to notice graphics problems when you are always alert for monsters creeping out ;) But Remember Me also ran at 150 FPS and then dropped to 30 with lag freezes, with and without v-sync.
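As a back-of-the-envelope illustration of the stutter side of that trade-off: with a fixed 60 Hz panel and v-sync, a finished frame has to wait for the next refresh tick, whereas a variable-refresh display (G-Sync/FreeSync) simply refreshes when the frame is ready. A small sketch with made-up frame times:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const double refresh_ms = 1000.0 / 60.0;        // fixed 60 Hz panel
    const double frame_ms[] = { 14.3, 22.2, 8.3 };  // hypothetical GPU frame times (~70, 45, 120 fps)

    for (double f : frame_ms) {
        // With v-sync on a fixed display, the frame is held until the next refresh boundary.
        double wait = std::ceil(f / refresh_ms) * refresh_ms - f;
        printf("frame %.1f ms: v-sync holds it %.1f ms extra; "
               "a variable-refresh panel would show it after %.1f ms\n",
               f, wait, f);
    }
    return 0;
}
```

Without v-sync the buffer would instead be flipped mid-scan, which is where the tearing comes from.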
-
Intel and Nvidia in talks about merger – rumour.
DJ aka GDR DJ posted a blog entry in DJ's Rumor Mill
Intel and Nvidia in Talks About Merger – Rumour. Intel May Acquire Nvidia, Jen-Hsun Huang May Lead New Company. Intel Corp. and Nvidia Corp. have restarted talks about a possible acquisition of the latter by the former, a recent market rumour claims. To make the rumour of a merger between two companies that would not produce significant synergies even more weird, it is believed that Jen-Hsun Huang, co-founder and chief executive of Nvidia, may become the head of the new company. Huge companies such as Intel and Nvidia routinely meet with each other to discuss potential collaborations, strategic unions, cooperation or even mergers and acquisitions. For example, after Advanced Micro Devices acquired ATI Technologies in 2006, Intel proposed to acquire Nvidia. The companies held a number of meetings, but Nvidia refused the takeover, just as it had rejected acquisition by AMD earlier that year, because it wanted Jen-Hsun Huang to become the chief executive officer of the new company. Intel then proposed that Nvidia license its GeForce graphics cores for integration into Core i-series "Ivy Bridge" and "Sandy Bridge" processors, which Nvidia also turned down. As Intel's chief executive officer, Paul Otellini, plans to retire in May '13, the company is naturally looking for a decent replacement. Since Jen-Hsun Huang could clearly be a candidate for the position, Intel's management recently held a meeting with Nvidia's management to discuss a potential acquisition once again, according to a report from the BrightSideOfNews website, which cites a new market rumour. Developments in the relationship between the two companies are kept secret, so nobody will know anything about the proceedings unless a deal is finalized. While discussions may be taking place, it is hard to imagine that the transaction will actually happen. There are numerous reasons for that, including vast differences in corporate culture; little synergy between the two companies and potentially loads of excess assets and/or people; and the inability to negotiate a price that would satisfy both parties. Intel and Nvidia did not comment on the news story. source
Hey guys, I bought an Nvidia 3D Vision 2 kit several months ago and use a monitor with a stable 120 Hz refresh rate, which means I could play ET at 60 fps in 3D if there were a way to enable 3D mode. Does anyone know whether enabling 3D in Enemy Territory's engine is possible? Isn't it Unreal Engine 3? 'Wolfenstein: Enemy Territory' isn't listed in Nvidia's compatible games list, but Unreal Engine 3 and Unreal Tournament are. regards
-
Today I had the weirdest lag problem I've ever had. First of all, my ping was rather unstable (well, going up from 40ish to 58), and that was when standing still. But here comes the unplayable part: as soon as I started moving, the ping went to 160 while FPS stayed stable at 125, and I started lagging like crazy (with a headache as a result). This happened on nq1, nq2 and, to a lesser degree, on jay4 (jay4 was still 'playable'). So I assume this isn't a server problem but a problem on my end (if not, I should have posted this on the tracker, sorry in that case). I did a WinMTR to nq1: Does somebody have an idea what could be causing this weird lag problem?
-
NVIDIA GeForce GTX 660 Gets Detailed. Although we still have no accurate information about actual clock speeds, we finally have a first look at the basic specs you can probably expect from such a card. User PHK is a VIP member of the Expreview forums and a recognized source of many leaks. Unfortunately, he doesn't mention the GTX 660 by name, but we may assume he is referring to NVIDIA's upcoming mid-range card. If the translation is accurate, then we might expect a very short PCB (his post mentions the number 4, as if it were inches, but that would be shorter than the GT 640, so I don't take it as the actual length). What we are seeing here is NVIDIA's new strategy for designing cards: it's definitely easier to develop a new PCB and apply some basic limitations to the GK104 GPU than to create a new one from scratch. As PHK mentions further, the GeForce GTX 660 will have 2GB of memory on a 192-bit bus. The card will be equipped with dual DVI, DisplayPort and one HDMI output, and powered by a 4-phase PWM design. He also says 'single 6'. This probably refers to the number of SMX clusters: 6 SMX equal 1152 CUDA cores. Finally, total dissipated power is reported at 100 Watts. The final specification of the GeForce GTX 660 would look like this:
-28nm GK104 Kepler GPU
-6 SMX clusters
-1152 CUDA cores
-1.5 or 2GB of GDDR5 memory
-192-bit memory interface
-Dual DVI, DisplayPort and HDMI output
-TDP of 100W
If the leak from Acer is correct, then we should expect the first GeForce GTX 660 cards within a week. On the other hand, if KitGuru is right (a huge overstock of GTX 570s is delaying the 660 launch), then we shall wait until August. source
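As a quick sanity check on that core count, Kepler's GK104 uses 192 CUDA cores per SMX, so the arithmetic behind the "6 SMX = 1152 cores" reading looks like this (just an illustration of the leak's math, not a confirmation of the final spec):

```cpp
#include <cstdio>

int main() {
    const int cores_per_smx = 192;  // CUDA cores per SMX on Kepler GK104
    const int smx_clusters  = 6;    // the "single 6" in PHK's post, read as 6 SMX
    printf("%d SMX x %d cores/SMX = %d CUDA cores\n",
           smx_clusters, cores_per_smx, smx_clusters * cores_per_smx);
    return 0;
}
```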
-
448 cores make a difference. The GeForce GTX 560 has been with us since January, and even the GeForce GTX 560 Ti was launched in late January. Since that time we have barely seen any action in Nvidia's performance segment, but this is about to change. Enter the Zotac GeForce GTX 560 Ti with 448 cores. Plain GTX 560 Ti cards have 384 cores, while this new card comes with 448, only 32 short of the GeForce GTX 570, which packs 480 shaders. The memory on this new GTX 560 Ti uses a 320-bit wide bus, while the reference GTX 560 Ti has to make do with 256-bit. The Zotac card packs 1280MB of memory, which should be more than enough for most users. Since the card is a limited edition, it looks more like a slightly underclocked GTX 570 than a GTX 560 Ti on steroids, and we believe that the chip behind this black and orange cooler is GF110 rather than GF114, but this is something we need to double-check. We do not know the price or launch date, but we expect to see it very soon. Enjoy the picture; we have no plans to remove it upon request. Source DJ
-
Nvidia and AMD have released new graphics drivers to improve performance in BF3, so don't forget to update!