You would think Nvidia would want consumers to buy their video cards and use them no matter what.
If someone wants to buy an Nvidia graphics card and use it as a dedicated PhysX card, they should be able to, right? WRONG!
Nvidia refuses to share space with AMD video cards in your desktop PC.
Early on this was perfectly acceptable: you could install an AMD Radeon as your graphics card and an Nvidia card as a dedicated PhysX card.
But in the last year or maybe two, Nvidia put a stop to this. They don't want you using dedicated hardware PhysX unless you have an Nvidia graphics card too.
It's plain silly; people are still putting Nvidia hardware in their computers either way.
AMD created TressFX, which simulates hair, and they gladly allowed it to be used on Nvidia hardware.
I don't see any reason not to allow end users to get the best of both worlds.
Nvidia wants an edge to gain consumer sales and figures that if you want hardware PhysX, you'll buy Nvidia. That's good thinking in one sense, but it also pushes away another kind of consumer: those who want Nvidia hardware just for PhysX.
No, they won't be buying the top-end video cards, but even gamers in the Nvidia-only camp don't spend $800+ on the top-end cards. So Nvidia isn't losing any sales on the top-end hardware in that sense.
Nvidia needs to open up hardware PhysX to AMD graphics card users again.
It’s not hurting your sales Nvidia.
These past few months I have played more games on PC than I have in the last two years on PC and consoles combined.
That's not hard to do: I can count how many games I played in the year before September on one hand, and it wouldn't take all five fingers.
But the lineup has been amazing. I thought PC gaming may have died at some point in the last two years. Maybe it did?
It has inspired me to upgrade both my son's PC and my own, and upgrades haven't been anywhere on my to-do list until these past few months.
I'll start with the new DOOM; this game brought me back from the land of no PC gaming.
Fast-paced, great graphics, plain great to play.
It also lets users switch to the new API known as Vulkan, which gives great speed improvements over the old OpenGL renderer that is still usable in game.
On a gaming rig this game is on the INSANE list, a must-play; it's really that good.
Telltale games aren't exactly changing the gaming world with graphics or a new genre.
But they do render comic-book stories beautifully, and Telltale has some great stories to tell indeed.
The games that hooked me are The Walking Dead: Michonne, Tales from the Borderlands, The Wolf Among Us, Game of Thrones & Batman: The Telltale Series.
All of these games are a blast to play, have great stories to boot, and the voice acting is top notch.
The Wolf Among Us
Gears of War 4, while it departs from the badasses of the original series, is still great fun for taking cover and killing baddies. Not to mention it uses the Unreal Engine 4!
Plus, if you pre-ordered this game you got it for both PC and Xbox One, and it came with map packs and digital versions of the Xbox 360 games Gears of War, Gears of War 2, Gears of War 3 & Gears of War: Judgment.
Not too bad for $60. Way to go, Microsoft!
It looks insane running at max detail in 4K.
And then there's Battlefield 1, which releases October 21st and which I did pre-order, because every Battlefield since Bad Company 2 has been amazing.
If you purchase digitally on EA's Origin, the Early Enlister Deluxe edition for $79.99 or the Ultimate edition for $129.98 gets you extra map packs and lets you play early, on the 18th, three days before the standard edition for $59.99 unlocks. Is it worth the extra $20 for three days early plus three extra map packs? Maybe not for everyone, but for me, HELL YEAH!
Check out these insane screenshots. WWI: prepare for shell shock!
Enter November! Yet another amazing game will be up and ready to play on the 11th.
If you haven't played Dishonored, then shame on you; it is one cool game.
But for those who have played the first, get ready for Dishonored 2.
It abandons the Unreal engine and now uses Arkane's new "Void" engine.
Don't fret: the engine appears to handle Dishonored 2 with great detail.
Dishonored 2 is being developed by Arkane Studios and published by Bethesda Softworks.
Here are some screens.
Not to forget Call of Duty: Infinite Warfare, which will be released on the 4th of November.
I liked the last Call of Duty, but I'm not anticipating the new one for whatever reason; I'll wait and see and hope for the best.
Recently the Vulkan API was put to full use in the new DOOM on PC.
Many think Vulkan is just an upgraded version of AMD's Mantle API, and that's partially true.
AMD donated their Mantle API to the Khronos Group, which announced that this new API would be the successor to OpenGL; the old name was discontinued and the API is now called Vulkan. Much cooler name anyhow, especially for Trek fans.
Vulkan is intended to provide a variety of advantages over other APIs as well as its spiritual predecessor, OpenGL: lower overhead, more direct control over the GPU, and lower CPU usage. Intended advantages include:
The Vulkan API is well suited for high-end graphics cards as well as the graphics solutions present on mobile devices (OpenGL had a specific subset for mobile devices called OpenGL ES).
In contrast to DirectX 12, Vulkan is available on multiple modern operating systems; like OpenGL, the Vulkan API is not locked to a single OS or device form factor. As of release, Vulkan runs on Windows 7, Windows 8, Windows 10, Tizen, Linux, and Android.
Reduced driver overhead, reducing CPU workloads.
Reduced load on CPUs through the use of batching, leaving the CPU free to do additional computation or rendering.
Better scaling on multi-core CPUs. Direct3D 11 and OpenGL 4 were initially designed for use with single-core CPUs and only received augmentations to be executed on multi-cores. Even when application developers use the augmentations, the APIs regularly do not scale well on multi-cores.
OpenGL uses the high-level language GLSL for writing shaders, which forces each OpenGL driver to implement its own GLSL compiler that executes at application runtime to translate the program's shaders into the GPU's machine code. Vulkan drivers instead ingest shaders already translated into an intermediate binary format called SPIR-V (Standard Portable Intermediate Representation), analogous to the binary format HLSL shaders are compiled into in Direct3D. By allowing shader pre-compilation, application initialization speed is improved and a larger variety of shaders can be used per scene. A Vulkan driver only needs to do GPU-specific optimization and code generation, resulting in easier driver maintenance and, in theory, smaller driver packages (as GPU vendors still have to include OpenGL/CL).
Unified management of compute kernels and graphical shaders, eliminating the need to use a separate compute API in conjunction with a graphics API. (The list above was taken directly from Wikipedia.)
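As a small illustration of why pre-compiled shaders help: a SPIR-V module is a plain binary file whose first 32-bit word is the magic number 0x07230203, so a tool or driver can cheaply confirm it has a compiled shader in hand instead of parsing GLSL source at runtime. Here is a minimal sketch in Python; the fake header bytes are fabricated for the example, not a real compiled shader.

```python
import struct

SPIRV_MAGIC = 0x07230203  # first word of every valid SPIR-V module

def looks_like_spirv(data: bytes) -> bool:
    """Check whether a byte buffer starts with the SPIR-V magic number.

    SPIR-V modules may be stored in either endianness, so accept both.
    """
    if len(data) < 4:
        return False
    (word,) = struct.unpack("<I", data[:4])
    if word == SPIRV_MAGIC:
        return True
    (word,) = struct.unpack(">I", data[:4])
    return word == SPIRV_MAGIC

# A fabricated 5-word SPIR-V header: magic, version 1.0, generator 0,
# bound 1, schema 0 -- just enough to exercise the check.
fake_header = struct.pack("<5I", SPIRV_MAGIC, 0x00010000, 0, 1, 0)
print(looks_like_spirv(fake_header))        # binary module: accepted
print(looks_like_spirv(b"#version 450\n"))  # GLSL source text: rejected
```

This is the kind of trivially cheap check a driver can do on a pre-compiled binary, versus running a full GLSL compiler the way OpenGL drivers must.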
Vulkan offers the same things DirectX 12 does, but it is not locked to a specific OS, and from what I have seen it performs fantastically.
And the API works with all flavors of graphics cards: Nvidia, AMD, Intel.
From what I have seen, at least with DOOM, Vulkan works great with AMD cards; expect excellent frame rates.
My experience with Vulkan in DOOM has been amazing: I can crank it up to Nightmare settings well above 1080p with 8x AA and it never even gets a hiccup.
I'm hoping the new Battlefield 1 will implement the Vulkan API, and I look forward to many, many games supporting it.
Mantle was good, but from what I see Vulkan is many jumps ahead of what Mantle was doing.
Nvidia was always the better card for OpenGL. AMD finally has an API that favors them.
AMD's Crimson drivers are excellent, and in the next year things will get even better.
The new drivers have already shown great improvements; AMD cards are now performing above similar Nvidia cards that at one point were faster.
Hopefully AMD will keep pace and do something huge in the desktop CPU department very soon. Like a 5.0GHz 16-core (or more) CPU ;-) We can hope. Or I can, anyhow.
I like to keep my options open, so I never limit myself to one specific brand and call myself a die-hard.
But for the last few years AMD has been the brand I generally purchase, CPU or video card.
The main reason for this is bang for the buck.
As far as CPUs go, I have only purchased AMD since they broke the 1GHz barrier back in 2000.
Sure, I've purchased a few Intel CPUs for HTPC builds, but for my everyday computer I choose AMD CPUs.
Intel CPUs are great, but a top-tier Intel CPU will cost you more than your mainboard, RAM, Blu-ray drive & SSD (or plain HDD) put together.
Want to game? Well, Intel plays best with Nvidia, and like Intel, Nvidia charges top dollar for its top-tier graphics cards ($700+), so an Intel CPU and an Nvidia card alone will cost you two grand for the two parts.
Two grand will build you one kick-ass AMD PC.
The biggest issue I've had with AMD graphics cards started when they decided on dual-BIOS cards.
Don't get me wrong, being able to flash your video card to test other firmware, or even unlock your $200 card into the $350 card, is great.
The dual BIOS comes with different settings depending on which position the switch or button is in.
One position gives a lower clock speed; the second gives a higher clock speed, which means louder fans and more heat.
I have owned the HIS 290X and now the Sapphire Radeon Fury Tri-X Nitro; the HIS has a switch and the Sapphire has a button to change BIOS versions.
Neither of them had any info on which BIOS version was attached to which position, so it's just try and fly.
I went through the instruction booklets for both brands and there isn't even a mention of the dual BIOS or what the switches are and do. There is not even a mention of the switch or button on the cards.
So it's contact the manufacturer or search the net and hope you get accurate info from forums.
On the HIS 290X, the switch clicked towards the back of the case is the Turbo/Nitro BIOS; clicked towards the front of the case, it's lower clock speeds, better temps, and quiet fans.
On the Sapphire Fury Tri-X Nitro, the button clicked so the light is on is the Nitro setting: max clock speeds, lots of heat, and fans that can get noisy. With the button clicked so the light is off: basic clock speeds, less heat, quiet fans.
These settings aren't something you can see as you use your PC unless you have a setup that gives real-time GPU temps.
This is something that should be explained; some people's setups don't handle heat dispersion well, and a maxed clock may just ruin the graphics card or the whole PC.
These manufacturers really need to include something in the paperwork to explain this.
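If you're on Linux, you don't have to fly blind: the amdgpu driver (like most others) exports temperatures through the kernel's standard hwmon sysfs interface, so you can watch what each BIOS position actually does to the card. A sketch, assuming the usual /sys/class/hwmon layout; sensor names vary by driver, and on a machine with no exposed sensors the function simply returns an empty dict.

```python
from pathlib import Path

def read_hwmon_temps(root: str = "/sys/class/hwmon") -> dict:
    """Return {chip/sensor: temperature in Celsius} from the hwmon tree.

    hwmon exposes temperatures in millidegrees Celsius in files named
    tempN_input; the chip name (e.g. 'amdgpu') lives in the 'name' file.
    """
    temps = {}
    base = Path(root)
    if not base.is_dir():
        return temps  # not Linux, or no sensors exposed
    for chip in sorted(base.iterdir()):
        try:
            name = (chip / "name").read_text().strip()
        except OSError:
            continue  # entry without a readable name file
        for f in sorted(chip.glob("temp*_input")):
            try:
                millideg = int(f.read_text().strip())
            except (OSError, ValueError):
                continue  # sensor momentarily unreadable
            temps[f"{name}/{f.stem}"] = millideg / 1000.0
    return temps

if __name__ == "__main__":
    for sensor, celsius in read_hwmon_temps().items():
        print(f"{sensor}: {celsius:.1f} C")
```

Run it once with the switch in each position and compare the readings under load; that tells you which position is the quiet BIOS and which is the hot one, no manufacturer paperwork required.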