langusiii
Posts: 9 | Last online: 09.10.2017
Date registered
08.09.2017
Sex
not specified
    • langusiii has written a new post "NVIDIA: Optimized nForce Driverpacks for Vista/Win7-10" 08.18.2017

      Regarding the 2nd issue reported before (RAID driver errors), I've just realized that the errors occur when I "RESTART" Windows, but not when I "SHUT IT DOWN" and then "START" it.

      Speculating, this could be related to the Win10 functionality of treating all disks as removable. The first 5 errors at boot would correspond to the 4 partitions plus the extended one on the 4th disk, and the 6th error 10 minutes later to a timeout for the controller. A shame I don't know how to see why the driver is reporting those errors to Windows.

      Rgds,

    • langusiii has written a new post "NVIDIA: Optimized nForce Driverpacks for Vista/Win7-10" 08.18.2017

      Finally it seems that my system is stable again, running Win10 1703 on an EVGA NVIDIA nForce 680i SLI (MCP55) with a SanDisk SSD (system disk), 2 x Seagate HDDs in RAID 0, and a PNY NVIDIA GeForce 7900 GS graphics card.

      I found out that there were two independent problems:
      1. I had 2 x GeForce 7900 GS in SLI configuration using NVIDIA driver 309.08 (last update). I unplugged one card, removed the SLI setup, and voilà! No more BSOD. The simplest approach always seems to be the best. My conclusion is that the driver, working under Win10, is not very comfortable with the SLI setup and is causing some memory corruption. Further tests would be needed, but for now it's working and stable with just one card.

      2. The nvraid driver (in-box) as well as all your versions of the nvrd64 drivers (9.99.09 & 11.1.0.43) generate Error entries (not Warnings) in the Event Viewer (5 at boot, 1 more 10 minutes after that, and occasionally one more), but they seem to be harmless. Although it's not nice to read those errors, I've been monitoring the RAID drive all week and all reports are Healthy with no errors at all, even under quite heavy video-editing use. The options are to get used to them or to deactivate those errors through regedit.

      When I get some free time back, I'll test with both cards again (it feels a pity having the second card sitting useless on top of the case) and I'll report back here in case somebody needs it, but regarding the RAID setup I won't pursue further analysis. If somebody finds out why Windows raises those errors and how to solve it, I'd be more than happy to test it.

      Googling "nvraid event id 11 windows 10" you can find other cases with the same conclusion as mine.

      Thanks for all your help!

    • langusiii has written a new post "NVIDIA: Optimized nForce Driverpacks for Vista/Win7-10" 08.14.2017

      Hi Fernando,
      Tried driver v6.99... it didn't work. When I sorted out the signature (the files were signed by NVIDIA but included your cert), I couldn't boot any more, not even into safe mode, so I restored Windows from scratch.
      So, starting with a plain & fresh, up-to-date Win10 1703 (plain as Windows leaves it, not even the graphics drivers), I tried:
      - In-box v10.6.0.24
      - v11.1.0.43 for orig scsidev mod+signed by Fernando
      - v9.99.0.9 for orig scsidev mod+signed by Fernando
      - v6.99 for orig scsidev mod+signed by Fernando (again couldn't boot after this)
      - Cold-unplugged the RAID disks.

      And the result was the same:
      - During the boot process, 5 Error entries in the Event Viewer with Event ID 11 for nvraid / nvrd64 (depending on the in-box or Fernando's drivers)
      - 10 mins after boot, 1 more Error entry in the Event Viewer with Event ID 11 for nvraid / nvrd64
      - Tested with some intensive disk-use tool such as robocopy or "sfc /verifyonly"; after a while running, my mouse / kb start to crawl (I mean, moving veeeeery slooowww).
      - Normal Task Manager readings for CPU, Memory and Disk during the crawling.
      - No detectable pattern in the Event Viewer entries (besides WindowsUpdateClient running when it wasn't supposed to, due to pause and deferral settings..., but it also happens without this process running)
      - Eventually, between 20 and 60 mins, Windows crashes with a BSOD showing VIDEO_TDR_FAILURE, but reading the memory.dmp this was just the last timeout triggered (before it, time-update, network, win32, etc. had raised their timeouts because everything was so slow).
      - Analyzing the memory.dmp, I see that the NVIDIA network driver tries to write a non-zero memory page. Memtest86+ ran with no problems after 8 extensive passes (20+ hrs).
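As a side note, the Event ID 11 counts described above can be tallied programmatically from an exported log. This is a rough sketch under my own assumptions (the helper name `count_event_id` and the sample XML are mine, not from any real export):

```python
# A rough sketch: counting Event ID 11 entries per provider in events
# exported as XML. On Windows the System log can be exported with e.g.
#   wevtutil qe System /f:xml > events.xml
# (wevtutil emits bare <Event> elements, so wrap them in a single root
# element before parsing). The sample below is made-up illustrative data.
import xml.etree.ElementTree as ET
from collections import Counter

NS = "{http://schemas.microsoft.com/win/2004/08/events/event}"

SAMPLE = """<Events xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <Event><System><Provider Name="nvrd64"/><EventID>11</EventID></System></Event>
  <Event><System><Provider Name="nvrd64"/><EventID>11</EventID></System></Event>
  <Event><System><Provider Name="disk"/><EventID>7</EventID></System></Event>
</Events>"""

def count_event_id(xml_text: str, event_id: int) -> Counter:
    """Count matching events per provider name."""
    counts = Counter()
    for event in ET.fromstring(xml_text).iter(NS + "Event"):
        system = event.find(NS + "System")
        if system is None:
            continue
        eid = system.find(NS + "EventID")
        if eid is not None and eid.text == str(event_id):
            counts[system.find(NS + "Provider").get("Name")] += 1
    return counts

print(count_event_id(SAMPLE, 11))  # Counter({'nvrd64': 2})
```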

      So even though those Event errors shouldn't be there, and they were the reason for my posts here, after disconnecting the RAID and still getting the same results, I'm not sure whether the problem is in fact Win10 with my entirely old hardware.

      Reading through this post, Kelly_41m narrows this error down to 1) memory-slot setup, 2) heat on the mobo, or 3) a weak power supply. I will explore the last two before giving up and returning to Win7.

      Any more ideas to try?

      Thanks in advance.


      https://social.technet.microsoft.com/For...n10itprogeneral

    • langusiii has written a new post "NVIDIA: Optimized nForce Driverpacks for Vista/Win7-10" 08.13.2017

      Yes, we know... I know. Electronics manufacturers don't want to support hardware more than 5 years back, and neither do software manufacturers... it's just planned obsolescence to generate trash and money...

      Anyway, I tried your attached "64bit nForce SATARAID drivers v9.99.09 WHQL.rar", but I couldn't find a driver for my NVIDIA nForce RAID Devices (see screenshot, because I'm not sure if I'm doing something wrong), and I know that if I leave them with 11.1.0.43 I will lose the RAID after booting.

      I also tried the one on your OneDrive, "64bit nForce SATARAID drivers v9.99.09 for orig scsidev mod+signed by Fernando.rar", and this one did indeed install well (see screenshot). A question here: when listing the options for changing the driver in Device Manager, each device (RAID Controller, Devices and SATA Controllers) listed about 6 driver options for the same driver version (see screenshot). My choice was the first one, but if I choose a different one, will it make any difference?


      However, the results in the Event Viewer were the same as with 10.6.0.24 (in-box) and 11.1.0.43: at boot I get 5 errors on nvrd64, 10 mins later another one, and after a while copying files from the RAID to the other disk the mouse and kb start to crawl, then BSOD with error VIDEO_TDR_FAILURE. So no luck yet.


      And forgive my ignorance: I saw that the difference between the "v9.99.09 WHQL" and the "v9.99.09 for orig scsidev" drivers, besides the signature, is in the [NVIDIA.ntamd64] section of the .INF file. What is the meaning of that section and its entries?
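For what it's worth, a section like [NVIDIA.ntamd64] is the INF "models" section for 64-bit Windows: each entry maps a device description to an install section plus the HardwareIDs the driver claims to support, which is what decides whether Device Manager offers the driver for a device. A rough sketch over an illustrative, made-up INF fragment (the helper name and the fragment are mine, not from the real driver package):

```python
# A rough sketch (not the driver's actual INF): parsing an INF "models"
# section to see which HardwareIDs a driver claims to support.
import configparser

# Illustrative fragment; section and key names only mimic the INF layout.
SAMPLE_INF = """
[Version]
Signature = "$Windows NT$"

[NVIDIA.ntamd64]
%NVRAIDBUS.DeviceDesc% = nvraidbus_inst, *NVRAIDBUS
%NVSTOR.DeviceDesc% = nvstor_inst, PCI\\VEN_10DE&DEV_037F&CC_0104
"""

def model_hardware_ids(inf_text: str, section: str) -> dict:
    """Map each model entry to the HardwareIDs it claims (install section dropped)."""
    parser = configparser.ConfigParser(interpolation=None)
    parser.optionxform = str  # keep the %...% keys exactly as written
    parser.read_string(inf_text)
    result = {}
    for desc, value in parser[section].items():
        _install_section, *hw_ids = [p.strip() for p in value.split(",")]
        result[desc] = hw_ids
    return result

print(model_hardware_ids(SAMPLE_INF, "NVIDIA.ntamd64"))
# {'%NVRAIDBUS.DeviceDesc%': ['*NVRAIDBUS'], ...}
```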

      Any other ideas to give a shot?

      Thanks in advance.

    • langusiii has written a new post "NVIDIA: Optimized nForce Driverpacks for Vista/Win7-10" 08.12.2017

      Hi Fernando, thank you for your response!

      Quote
      Have you ever tried the NVIDIA nForce SataRAID drivers v9.99.0.9?


      Yes, but I used the one in your post #1, so I couldn't set that driver for the "NVIDIA nForce RAID Device" and I lost the complete RAID; I didn't use your OneDrive mods from post #146/*. So I'll give those a try and come back with results.

      Quote
      Please check the HardwareIDs of that device


      I found in the online MS documentation that the "Microsoft VHD Loopback Controller" is a virtual hard drive controller used, among other things, when you mount an ISO image. Yesterday I mounted and unmounted an ISO image just to see what was inside and then took the screenshot (which is why it's disabled in gray), but now there is no such device, so it's not related to the RAID issues. Anyway, I mounted it again and the HardwareID is {8e7bd593-6e6c-4c52-86a6-77175494dd8e}\MsVhdHba

    • langusiii has written a new post "NVIDIA: Optimized nForce Driverpacks for Vista/Win7-10" 08.12.2017

      Hi Fernando, thanks for your answer!

      I have just 3 storage devices plugged in, as the screenshots show: 1) both RAID disks on SATA #1, 2) the system SSD, and 3) an extra disk which is a temporary manual mirror of the RAID on SATA #2. All the others are in gray, which I presume means "disconnected", but are shown because I "unhide devices". Anyway, I'll follow your suggestion and disconnect 3) for the next tests.

      Quote
      1. Which device manages the listed "Microsoft VHD Loopback Controller"?


      I have no idea. In my first post's screenshot it wasn't there, nor is it now, but it seems it was there earlier today when I took the previous screenshots. My first guess is an ISO image I mounted earlier today, and my second is the OneDrive I've just gotten rid of.

      Quote
      2. Which is the DeviceID of your "NVIDIA nForce Serial ATA Controller"?


      Sorry for my ignorance; if DeviceID is the same as HardwareID, then:
      PCI\VEN_10DE&DEV_037F&SUBSYS_C55E10DE&REV_A2
      PCI\VEN_10DE&DEV_037F&SUBSYS_C55E10DE
      PCI\VEN_10DE&DEV_037F&CC_010485
      PCI\VEN_10DE&DEV_037F&CC_0104
      If not, please tell me where to find it.
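Those HardwareID strings are structured, from most to least specific: VEN_ is the PCI vendor ID (10DE is NVIDIA), DEV_ the device ID, SUBSYS_ the subsystem ID, and REV_ / CC_ the revision and class code. A small illustrative sketch (the helper name is mine):

```python
# A rough sketch: decoding the fields of a PCI hardware ID string like
# the ones listed above. Not a driver API, just plain string splitting.

def parse_pci_hardware_id(hw_id: str) -> dict:
    """Split 'PCI\\VEN_...&DEV_...&...' into a field -> value dict."""
    bus, _, rest = hw_id.partition("\\")
    fields = {"bus": bus}
    for part in rest.split("&"):
        key, _, value = part.partition("_")
        fields[key] = value
    return fields

parsed = parse_pci_hardware_id("PCI\\VEN_10DE&DEV_037F&SUBSYS_C55E10DE&REV_A2")
print(parsed["VEN"], parsed["DEV"])  # 10DE 037F
```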

      Attached are the Event ID 11 entries on nvrd64 at boot time.


      Thanks in advance.
      Rgds.



    • langusiii has written a new post "NVIDIA: Optimized nForce Driverpacks for Vista/Win7-10" 08.11.2017

      Quote
      Which nForce SATARAID driver version did you use


      64bit nForce SATARAID drivers v11.1.0.43 for orig scsidev mod+signed by Fernando

      Quote
      which is the DeviceID of your NVIDIA nForce RAID Controller?


      DeviceID = HardwareID? ==>
      ACPI\NVRAIDBUS
      *NVRAIDBUS


      Ruled out a memory problem: 8 extended passes of memtest (20 hrs run) say "no problem", as I presumed, since there was no problem over many years of Win7 until last weekend's fresh Win10 install...

      I went through my notes and realized a couple of common factors in the Event Viewer across crashes: 1) nvraid before and nvrd64 now, Event ID 11 "The driver detected a controller error on ." with 4 or 5 entries after booting and 1 entry around when the crawling mouse & kb starts, 30 mins before the BSOD crash; and 2) WindowsUpdateClient had started some update that I didn't authorize (I hate the idea of some lab tech using me as a beta tester and messing with my computer's performance when I probably need it most, but it's the price... for now). It is probably something related to the RAID controller driver, but I hadn't realized WUC was running (even though the Win10 system, pagefile.sys and TEMP are on the SSD, not the RAID).

      Do you know the meaning of "Device not Migrated" for the NVIDIA STRIPE disk? Is it because I'm using the MoBo RAID, or something else?


      Any other ideas to try out?

      Thanks in advance and rgds.

    • langusiii has written a new post "NVIDIA: Optimized nForce Driverpacks for Vista/Win7-10" 08.10.2017

      Hi Fernando,

      Quote
      The devices named "NVIDIA nForce RAID Device" (= members of the RAID array) are usually hidden. To show them listed within the "Storage Controllers" section of the Device Manager you have to hit "View" on the Device Manager menu bar and then check the option "Show hidden devices". For details look >here<.


      Of course I "showed hidden devices", but when using the drivers from post #1 on the "NVIDIA nForce RAID Device", it said those drivers didn't match my hardware. Using your "orig scsidev" drivers from the OneDrive did work and allowed me to see the RAID again.

      Quote
      You can find the answer >here<.


      Reading that post, I think both are relevant when the RAID is your system disk, whether using my modified or your modified scsidev. In my case the SSD is the system disk and the RAID is the file disk, so I think that post #146 or #311 should work the same for me, right?

      However, this morning it crashed again with the same errors as at the beginning: copying files (this time not involving the RAID), some nvrd64 errors (before, they were on the in-box Win10 driver nvraid), things start to crawl, and BSOD with VIDEO_TDR_FAILURE. The differences this time were: a) nvrd64 instead of in-box nvraid, b) the last event entry 30 minutes before the crash was "Bad Memory Regions", which is weird, since I didn't have any problems on Win7 until last weekend and I haven't touched the mobo (just plugged the SSD into an existing cable), and c) fricking WinUpdate was installing the August rollup (KB4034674) when it started to crawl; there is no event entry for it finishing successfully, but checking now it says it's up-to-date, weird again.

      I will run an intensive memtest (takes like 8 hrs) just to rule out memory problems, and I'll be back tonight.

      Thanks in advance.

      Rgds,

      Javier (alias LangusIII)

    • langusiii has written a new post "NVIDIA: Optimized nForce Driverpacks for Vista/Win7-10" 08.10.2017

      Hi Fernando, I found your forum just a few days ago; it's really impressive, congratulations! You've been around for a while helping folks, so first of all thank you for your work for the community!!

      My old rig mainly dates from 2007, but it's still running like brand new, so after years of Win7 Ult x64 without problems I decided to upgrade it from scratch to Win10 Pro 1703 x64 on a new SSD, to bring it level with the other computers at the office.

      Everything was smooth until I ran my robocopy backup from my 2-disk striped RAID to another disk. After 20 mins running, I got a crawling mouse and kb for about 40 mins (running with less than 40% CPU & memory use), finally crashing with a BSOD saying VIDEO_TDR_FAILURE on nvlddmkm.sys. After restarting, I found an entry in the Event Viewer from 40 mins before the crash with Event ID 11 on nvraid. All robocopies lasting more than 20 mins crash through the same process.

      So I believe it's not the graphics driver (2 x PNY NVIDIA GeForce 7900 GS SLI) but the RAID driver (EVGA nForce 680i SLI, P/N: 122-CK-NF68-AR).

      Win10 installed driver 10.6.0.24 for the RAID, so I manually forced it to 11.1.0.43 (first from the NVIDIA package and then from yours), but the entire RAID is missing after rebooting (even though it's still in the MediaShield BIOS, it's "disconnected" in Device Manager and MIA in Disk Management). The process I'm following is as you suggested: update the NVIDIA nForce RAID Controller and the NVIDIA nForce Serial ATA Controllers (3 of them), but I couldn't find a driver for the NVIDIA nForce RAID Devices (2 of them), so those stay on the in-box 10.6.0.24, and I believe this could be the problem. I also tried 9.99.0.9 with the same result.

      So the questions would be whether you share my reasoning, and where to find the "Device" driver.

      Attached are some screenshots for reference.

      Thanks in advance!
      14:30 US-CT

      [Update]
      Shouldn't the Hardware IDs of the Disk drives / NVIDIA Stripe match the Storage controllers / NVIDIA nForce RAID Device (see 2nd screenshot)?
      19:00 US-CT

      [Update2]
      After some hours, I figured out how to get the RAID back using your 11.1.0.43 x64 mod; tomorrow I'll run more disk-intensive tests to see whether I also got rid of my original problem, the BSOD.
      The remaining question: on your OneDrive there are two versions, "mod" and "orig" "scsidev"; what would be the difference? (booting vs. non-booting RAIDs, respectively?)
      And again, thanks for all your work on this forum; before I arrived here I was about to give up on Win10 and return to 7...
      23:40 US-CT
