I have a weird problem and I'm hoping someone can help me.
I work for a small archive, and we purchase motion picture film scanning equipment from a local builder. I build my own computers, nothing too fancy: mainly a lot of storage to handle the digitization files, plus enough CPU and RAM to do basic editing (running Windows 7 Pro). I also make sure these computers exceed the specs required for the scanners. The problem I've had, though, is with some new scanners we bought. Some background on the scanners: they are very simple. A camera is connected to the computer via USB, the scanner's hardware triggers the camera to take pictures, and those frames are fed into a file built by the scanning software.
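To make the data path concrete, here is a rough sketch in Python of the capture pattern as I understand it. This is only my guess at what the scanning software does, not the vendor's actual code, and all of the names (FakeCamera, get_triggered_frame) are made up for illustration:

# Rough sketch of the triggered USB capture pattern (hypothetical names,
# not the vendor's actual software).
class FakeCamera:
    """Stand-in for the USB camera; returns None if a frame was missed."""
    def get_triggered_frame(self, timeout_s=1.0):
        return b"\x00" * 1024   # placeholder frame data

def capture(camera, path, expected_frames):
    dropped = 0
    with open(path, "wb") as out:
        for _ in range(expected_frames):
            frame = camera.get_triggered_frame(timeout_s=1.0)
            if frame is None:        # driver/camera buffer overflowed before we read it
                dropped += 1
                continue
            out.write(frame)         # frames are appended into one scan file
    return dropped

print(capture(FakeCamera(), "scan.raw", expected_frames=10))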
The scanner has two speed settings, fast and slow. Everything works fine at the slow speed, but at the fast speed, after capturing about 100 feet of film it starts dropping frames like crazy. It's almost as if the computer can't handle the data coming in from the camera. When I mentioned this to the scanner builder, he said there's something odd about custom-built computers and that they tend to have this problem; with a Dell or the like, it doesn't happen. I happened to have a Dell laptop in the office and, sure enough, at the fast speed the scanning software has no problem and doesn't drop a single frame.
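My working theory of what "can't handle the data" would look like: if whatever drains the camera buffer (disk writes, most likely) is slightly slower than the fast-speed data rate, the backlog grows until the buffer fills, and only then do frames start getting thrown away. That would explain why it runs clean for a while and then falls apart. A toy simulation of that, with completely made-up rates and buffer size:

# Toy model: producer (camera) slightly faster than consumer (disk writes).
BUFFER_FRAMES = 240       # assumed driver/app buffer size, made-up number
PRODUCE_FPS = 24.0        # assumed fast-speed capture rate
CONSUME_FPS = 23.0        # assumed effective drain rate if the disk can't quite keep up

backlog = 0.0
dropped = 0.0
for second in range(600):                        # simulate 10 minutes of scanning
    backlog += PRODUCE_FPS - CONSUME_FPS         # backlog grows 1 frame per second
    if backlog > BUFFER_FRAMES:
        dropped += backlog - BUFFER_FRAMES       # overflow = dropped frames
        backlog = BUFFER_FRAMES
print(dropped)   # zero until the buffer fills (~4 minutes here), then climbs steadily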
My question is, am I doing something wrong on my custom builds to cause this? It makes no sense to me. All the drivers are up to date, and the problem happens even on a USB 3.0 port, which should have more than enough bandwidth for the incoming data. I can't help but think there's a setting, or maybe even a jumper on the motherboard, that I'm not getting right.
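For what it's worth, here is my rough math on the bandwidth. These are assumed numbers since I don't know the camera's exact output format (I'm guessing a 2K machine-vision camera sending raw sensor data, 2 bytes per pixel, at 24 fps):

# Back-of-envelope data rate with assumed frame size and bit depth.
width, height = 2048, 1556       # assumed 2K scan resolution
bytes_per_pixel = 2              # assumed raw sensor data, 10/12-bit in 16-bit containers
fps = 24                         # assumed fast-speed frame rate

frame_bytes = width * height * bytes_per_pixel      # ~6.4 MB per frame
rate_mb_s = frame_bytes * fps / 1e6                 # ~153 MB/s

usb3_practical_mb_s = 400   # rough real-world USB 3.0 throughput (spec raw rate is 5 Gbps)
print(rate_mb_s, usb3_practical_mb_s)

Even with those guesses, the camera should be well under what USB 3.0 can carry, which is why I keep coming back to something on the computer side.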
If anyone has any ideas about this weird issue, I'd greatly appreciate it.