
Is this Intel i7 machine good for LTSpice?

Started by Joerg November 2, 2014
rickman wrote:
> On 11/3/2014 5:09 PM, Joerg wrote:
>> rickman wrote:
>>> On 11/2/2014 5:28 PM, Jeff Liebermann wrote:
>>>> On Sun, 02 Nov 2014 14:56:04 -0500, rickman <gnuarm@gmail.com> wrote:
>>>>
>>>>>>> One catch. LTspice saves its preferences to:
>>>>>>> C:\windows\scad3.ini
>>>>>>> which has to be writeable. The fix is to use the
>>>>>>> -ini <path>
>>>>>>> command line switch, which will:
>>>>>>> Specify an .ini file to use other than %WINDIR%\scad3.ini
>>>>>>> <http://ltwiki.org/LTspiceHelp/LTspiceHelp/Command_Line_Switches.htm>
>>>>>>>
>>>>
>>>>> I need to note this somewhere. Writing to the Windows directory is a
>>>>> *very* bad idea.
>>>>
>>>> It was standard procedure in Windoze 3.1, where almost all
>>>> applications dropped pick_a_name.ini files in the C:\Windows\
>>>> directory.
>>>
>>> Yes, and Windows 3.1 crashed on a regular basis for about any reason
>>> whatsoever just like 95, 98 and ME.
>>>
>>> MS has been telling developers since Win2000 and maybe since NT to not
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>>> put data files in the Windows or Program Files directories. Many chose
>>> to ignore this which wasn't enforced until Vista and became one of the
>>> things everyone loves to hate about Vista.
>>>
>>
>> Maybe. But for us users only one thing counts: That stuff works.
>
> Do you build your stuff so that if the user connects a different
> computer it craps out? No, you design your interfaces *correctly* so
> that it works now and it keeps working when some peripheral piece that
> should have no impact is changed out.
>
I design it so that it also works correctly with legacy gear. In aerospace that can mean equipment from before you and I were born.
> These developers are designing crappy software and blaming it on MS.
>
I've underlined the important part above. You might remember that there were operating systems before Win2k and that there was software written for those.
>
>>>> I do have to admit it was handy as the files were easy to
>>>> find and save. The new and improved versions of Windoze hide these
>>>> config files in either the registry, or bury them 5 directory layers
>>>> deep, where few can find them without specialized tools or inside
>>>> information.
>>>
>>> Windows doesn't put anything from an app in the registry. That is up to
>>> the app to decide. Getting to these directories is easy if they used
>>> the right location, C:\ProgramData. Instead they continue to use
>>> C:\Program Files and now with Win8 MS puts the files in the long path
>>> name you list, but I believe they can be reached transparently through
>>> the path C:\Program Files So the best of both worlds.
>>>
>>> If the app puts them somewhere else, don't blame windows.
>>>
>>
>> If it was allowed in old Windows, isn't in new Windows, and there isn't
>> a user selector about this then I blame Windows.
>
> "Allowed" meaning it didn't crap out, yes. "Allowed" meaning the
> developers were not designing according to best practices, no.
>
If it was not disallowed it was ok. Even today it's still this way. Personally I also think it was wrong but it is what it is. Many CAD programs still store their libraries in the program folder and, naturally, libraries are meant to be modified and added to.
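For LTspice specifically, the -ini switch quoted at the top of this post is the usual workaround: point it at an .ini file in a user-writable folder. A minimal sketch of a launcher, assuming a stock LTspice IV install and an arbitrary writable location for the .ini (both paths below are placeholders, not the layout of any particular machine):

    # Sketch: start LTspice IV with a user-writable preferences file instead of
    # C:\windows\scad3.ini. Both paths are placeholders; adjust to your setup.
    import subprocess

    ltspice_exe = r"C:\Program Files\LTC\LTspiceIV\scad3.exe"   # assumed install path
    ini_file = r"C:\Users\me\Documents\LTspice\scad3.ini"       # any writable folder

    subprocess.run([ltspice_exe, "-ini", ini_file])

The same two arguments can of course go straight into a desktop shortcut's target line; the script only spells out the argument order.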
>
>>>>> I can't tell you how many developers do all sorts of
>>>>> things they aren't supposed to under windows. That is the actual cause
>>>>> of many problems people have running older software under Windows. They
>>>>> don't listen to the people providing them with the OS!
>>>>
>>>> LTspice (aka SwitcherCAD) is a rather old program, with many of the
>>>> traditions of Windoze 3.1 still present. If you don't like that, try
>>>> running some of the various NEC antenna modeling programs, that still
>>>> use the terms "card" and "deck" from the Hollerith punch card era. The
>>>> common mantra is the same everywhere... if it works, don't touch it.
>>>
>>> These programs have been updated many, many times since Windows 3.1.
>>> Windows NT, 2k, XP, Vista, 7, 8 and 8.1 aren't even the same OS as the
>>> 3.1 tree which was ended when XP was released. Stick with the old
>>> habits and blame yourself or your program maintainer.
>>>
>>> I use some open source Windows software that does the same crap and I am
>>> very vocal about the cause and the fix for the problem. Few of the
>>> developers are interested though. Now that 8 makes this (using Program
>>> Files for data) work adequately they no longer have a need to change it.
>>>
>>> If you are relying on programming habits from over 20 years ago, then
>>> you will have to stew in your own soup.
>>>
>>
>> Easy to say for someone who probably never has to deal with beamfield
>> sims and such. Bottomline there are programs some of us have to use
>> where there is no alternative. Where the design teams have dissolved
>> decades ago and some of the folks are not with us on earth anymore. My
>> record so far is a chunk of software that was stored on an 8" floppy.
>
> Yeah, exactly. If you are that far back in time you man need to rethink
> your approach.
>
Any suggestions there? Should I tell my clients that this, that and the other project can't be done because only older design tools are available? Imagine you have a leak in the house. The plumber comes, takes a look and says "That wouldn't be up to code these days although it was back then. I suggest you build a new house and have this one torn down".
>
>> Software does not automatically lose its value because it is over 20
>> years old. Or would you pour a bottle of 1995 Domaine Leflaive
>> Montrachet Grand Cru [*] into the sink because it is old?
>
> Actually software does degrade with time as you are finding out. If you
> can't find a platform to run it on, it has worn out.
>
My software does not wear out. My hardware does. I get more and more into pulse-echo stuff and esoteric switch mode designs where the amount of data to be crunched overwhelms the machine.

BTW, when the in-circuit tester at one client conked out the culprit was the PC. It had an ISA bus. It was no problem to buy a brand-new machine at a very reasonable price that has an ISA bus. Except now they also have a CD drive in it.
>
>> Talking about using legacy stuff, the aircraft guys are a bit more
>> extreme there. This aircraft is going to celebrate its 80th soon and is
>> used commercially:
>>
>> https://www.youtube.com/watch?v=jx11k1r1Pm8
>>
>> [*] It runs north of $5k. Per bottle.
>
> Good, maybe your beamfield sim will run on it. :)
>
:-)

Example from a few years ago: A nasty alarm system problem had to be diagnosed. The software from that system was from the 80's. If I hadn't been able to run really old software here at my lab I would have had to turn down that whole job. That would not be what I call smart.

--
Regards, Joerg

http://www.analogconsultants.com/
On Tuesday, November 4, 2014 at 00:42:36 UTC+1, rickman wrote:
> On 11/3/2014 5:47 PM, Joerg wrote:
> > rickman wrote:
> >> On 11/3/2014 4:28 PM, Joerg wrote:
> >>> rickman wrote:
> >>>> On 11/3/2014 3:51 PM, Joerg wrote:
> >>>>> DecadentLinuxUserNumeroUno wrote:
> >>>
> >>> [...]
> >>>
> >>>>>> Since I cannot afford to put $1000 into a Titan video card, I miss on
> >>>>>> a few benchmarks with my $250 GTX650.
> >>>>>
> >>>>>
> >>>>> I am not at all concerned about video because that's just used for
> >>>>> static display and sometimes video conferencing. No games, no movies.
> >>>>
> >>>> If you are going for power, you need to have separate video memory or
> >>>> the video eats memory bandwidth which is often the limiting factor on a
> >>>> multicore machine.
> >>>>
> >>>> I haven't kept up with the hotrod machines these days, but I'd be
> >>>> willing to bet you will get a lot better performance with multi-banked
> >>>> RAM. Does this machine have two or more memory interfaces or just one?
> >>>>
> >>>
> >>> No clue. But with SPICE the graphics action is very slow, just a wee
> >>> progress of a few traces on an otherwise static screen. And you could
> >>> even turn that off.
> >>
> >> You aren't grasping the concept. Video memory needs a sizable bandwidth
> >> to *display* the image to the screen. All the data that goes out over
> >> your HDMI cable is being read from memory *all the time*. You're a
> >> bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.
> >>
> >> This has *nothing* to do with drawing the images into graphic memory.
> >>
> >> The memory bank question will likely be more important than the number
> >> of cores in the CPU. The guy who can run 16 threads has at least two
> >> memory interfaces or it would be bogging down between 4 and 8 cores.
> >>
> >
> > Well ... we did enter the 21st century. In this day and age graphics
> > cards come with their own memory. AFAIK the Nvidia GT720 has 1GB of on
> > board RAM. Others have more but that sounds sufficient. Also, there is
> > no need to store 60 frames if the content is more or less static.
>
> I don't know if you are playing with me or what. Yes, that is what I am
> telling you, get a system with separate graphic memory which means a
> separate graphics chip. Many mobos have built in video with *no* video
> ram.
>
Even so, unless you play 3D games that need a massive amount of texture memory, I doubt it matters much.

An i7 has something like 30+ GByte/sec of memory bandwidth, depending on memory config. Refreshing two full-HD monitors at 60 Hz is only a few percent of that.

-Lasse
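Lasse's estimate is easy to sanity-check. A quick back-of-the-envelope in Python, assuming 4 bytes per pixel and the roughly 30 GB/s figure quoted above (both are assumptions, not measured numbers for any specific machine):

    # Refresh bandwidth for two 1920x1080 monitors at 60 Hz, compared with an
    # assumed ~30 GB/s of main-memory bandwidth for an i7-class machine.
    width, height = 1920, 1080
    bytes_per_pixel = 4          # 32-bit framebuffer assumed
    refresh_hz = 60
    monitors = 2

    refresh_bw = width * height * bytes_per_pixel * refresh_hz * monitors
    main_memory_bw = 30e9        # figure quoted in the thread, config dependent

    print(f"Refresh traffic: {refresh_bw / 1e9:.2f} GB/s "
          f"({100 * refresh_bw / main_memory_bw:.1f}% of {main_memory_bw / 1e9:.0f} GB/s)")

That works out to about 1 GB/s, i.e. roughly 3% of the quoted bandwidth: "a few percent", as stated, and only relevant at all if the graphics controller shares main memory.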
rickman wrote:
> On 11/3/2014 4:57 PM, Lasse Langwadt Christensen wrote:
>> On Monday, November 3, 2014 at 22:42:14 UTC+1, rickman wrote:
>>> On 11/3/2014 4:28 PM, Joerg wrote:
>>>
>>> [...]
>>>
>>> You aren't grasping the concept. Video memory needs a sizable bandwidth
>>> to *display* the image to the screen. All the data that goes out over
>>> your HDMI cable is being read from memory *all the time*. You're a
>>> bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.
>>>
>>
>> with anything but a graphics-card integrated in the chipset that memory
>> will be on the card itself, 1920*1080*24bit is less that 7MB
>
> It is not the amount of memory, it is the video bandwidth to keep the
> monitor refreshed. Yes, it should be separate from the main memory or
> you take a hit from the video accesses.
>
It _is_ separate from the main memory on all modern graphics cards. The core circuitry of a PC has nothing to do with screen refresh. That was even the case with an old Tseng Labs card I had in the early 90's.

--
Regards, Joerg

http://www.analogconsultants.com/
On Sun, 02 Nov 2014 07:25:49 -0800, Joerg <news@analogconsultants.com>
wrote:

>Folks,
>
>Need to spiff up my simulation speeds here. IIRC Mike Engelhardt stated
>that the Intel i7 is a really good processor for LTSPice. According to
>this it looks like the 4790 is the fastest of the bunch:
>
>http://www.intel.com/content/www/us/en/processors/core/core-i7-processor.html
>
>So, what do thee say, is the computer in the Costco link below a good
>deal for LTSpice purposes?
>
>http://www.costco.com/Dell-XPS-8700-Desktop-%7c-Intel-Core-i7-%7c-1GB-Graphics-%7c-Windows-7-Professional.product.100131208.html
>
>It's also available without MS-Office Home & Student 2013 for $100 less
>but I found that OpenOffice isn't 100% compatible in the Excel area so
>that sounds like an ok deal. My hope is that it can drive two 27"
>monitors but I guess I can always add in another graphics card if not.
>
>Reason I am looking at these is that I absolutely positively do not want
>any computer with Windows 8 in here and unfortunately that's what many
>others come with.
Should be fine for LTspice. The 1600 DRAM makes a huge difference. If you want a real screamer, then it's another story: Xeon class, a real number cruncher.

Cheers
On 11/3/2014 6:56 PM, Joerg wrote:
> rickman wrote:
>> On 11/3/2014 5:09 PM, Joerg wrote:
>>> rickman wrote:
>>>> On 11/2/2014 5:28 PM, Jeff Liebermann wrote:
>>>>> On Sun, 02 Nov 2014 14:56:04 -0500, rickman <gnuarm@gmail.com> wrote:
>>>
>>> [...]
>>>
>>> Maybe. But for us users only one thing counts: That stuff works.
>>
>> Do you build your stuff so that if the user connects a different
>> computer it craps out? No, you design your interfaces *correctly* so
>> that it works now and it keeps working when some peripheral piece that
>> should have no impact is changed out.
>>
>
> I design it so that is also works correctly with legacy gear. In
> aerospace that can mean equipment from before you and I were born.
And if that gear was not designed to spec you are screwed. You will have to sit down and reverse engineer the unit so you can design the interface. Do you really expect MS to do that with all the crappy software that was designed poorly?
>> These developers are designing crappy software and blaming it on MS.
>>
>
> I've underlined the important part above. You might remember that there
> were operating systems before Win2k and that there was software written
> for those.
And software written for an OS like Win95 is not assured of working with the newer and better-made OSes. Heck, software written for Win95 didn't work with Win95 half the time. That was what was wrong with Win95: it didn't do a good enough job of protecting your computer from the crappy software.

You seem to think everything is as simple as your bicycle. OS development is continuing. There are significant problems with older OSes and they are trying to fix those problems. If you want to run DOS software, why not run DOS? I have read here that it is still available and the hardware should still run it.
>>>>> I do have to admit it was handy as the files were easy to
>>>>> find and save. [...]
>>>>
>>>> Windows doesn't put anything from an app in the registry. [...]
>>>>
>>>> If the app puts them somewhere else, don't blame windows.
>>>>
>>>
>>> If it was allowed in old Windows, isn't in new Windows, and there isn't
>>> a user selector about this then I blame Windows.
>>
>> "Allowed" meaning it didn't crap out, yes. "Allowed" meaning the
>> developers were not designing according to best practices, no.
>>
>
> If it was not disallowed it was ok.
See, that is the BS that got you into the problem. Now you are trying to justify the bad development practices. I surely hope you don't use that philosophy in the stuff you design. If it works, it is ok, ship it! Then someone changes a process a bit and the design stops working.
> Even today it's still this way.
> Personally I also think it was wrong but it is what it is. Many CAD
> programs still store their libraries in the program folder and,
> naturally, libraries are meant to be modified and added to.
Then put that on the CAD designers, not MS. They told them not to do it with W2k and XP; they made it hard to do with Vista and 7. Now with Win8 they have found a way to fake it out and put the files somewhere else. They are just trying to make the computer harder to hack, but no one wants to work with them.
>>>>>> I can't tell you how many developers do all sorts of
>>>>>> things they aren't supposed to under windows. [...]
>>>>>
>>>>> LTspice (aka SwitcherCAD) is a rather old program, with many of the
>>>>> traditions of Windoze 3.1 still present. [...]
>>>>
>>>> [...]
>>>>
>>>> If you are relying on programming habits from over 20 years ago, then
>>>> you will have to stew in your own soup.
>>>>
>>>
>>> Easy to say for someone who probably never has to deal with beamfield
>>> sims and such. Bottomline there are programs some of us have to use
>>> where there is no alternative. Where the design teams have dissolved
>>> decades ago and some of the folks are not with us on earth anymore. My
>>> record so far is a chunk of software that was stored on an 8" floppy.
>>
>> Yeah, exactly. If you are that far back in time you man need to rethink
>> your approach.
>>
>
> Any suggestions there? Should I tell my clients that this, that and the
> other project can't be done because only older design tools are available?
Yes, tell your clients that there has been no software written since 1975.
> Imagine you having a leak in the house. The plumber comes, takes a look
> and says "That wouldn't be up to code these days although it was back
> then. I suggest you build a new house and have this one torn down".
If you think that is at all analogous then you deserve the problems you are having. I suppose you are still driving the car you had in the 70's too?
>>> Software does not automatically lose its value because it is over 20
>>> years old. [...]
>>
>> Actually software does degrade with time as you are finding out. If you
>> can't find a platform to run it on, it has worn out.
>>
>
> My software does not run out. My hardware does. [...]
>
> [...]
>
> Example from a few years ago: A nasty alarm system problem had to be
> diagnosed. The software from that system was from the 80's. If I hadn't
> been able to run it really old software here at my lab I would have had
> to turn down that whole job. That would not be what I call smart.
I guess you will have to close down shop in a few more years then. Even Win7 is going bye-bye before too long. You can learn to use computers or be a victim, your choice.

--

Rick
On 11/3/2014 6:59 PM, Joerg wrote:
> rickman wrote:
>> On 11/3/2014 4:57 PM, Lasse Langwadt Christensen wrote:
>>> On Monday, November 3, 2014 at 22:42:14 UTC+1, rickman wrote:
>>>
>>> [...]
>>>
>>> with anything but a graphics-card integrated in the chipset that memory
>>> will be on the card itself, 1920*1080*24bit is less that 7MB
>>
>> It is not the amount of memory, it is the video bandwidth to keep the
>> monitor refreshed. Yes, it should be separate from the main memory or
>> you take a hit from the video accesses.
>>
>
> It _is_ separate from the main memory on all modern graphics cards. The
> core circuitry of a PC has nothing to do with screen refresh. That was
> even the case with an old Tseng Labs card I had in the early 90's.
Ok, now I understand why you couldn't get what I am saying. Computers have come a long way since the 90's. Most computers, desktop as well as laptop, now integrate the video controller into the main chipset and use main memory as video RAM, *NOT* as a separate function with its own memory.

If you don't believe me, look at the specs on a few systems. Anything that talks about Intel XYZ graphics has an integrated controller and shares main memory for video. In fact, you said something about this yourself in this thread where you mentioned video on the motherboard, I believe. Video on the motherboard is usually integrated. If you get a graphics card it will be separate. A very few motherboards have a separate video controller on board with separate video memory.

--

Rick
On 11/3/2014 6:58 PM, Lasse Langwadt Christensen wrote:
> On Tuesday, November 4, 2014 at 00:42:36 UTC+1, rickman wrote:
>> On 11/3/2014 5:47 PM, Joerg wrote:
>>
>> [...]
>>
>> I don't know if you are playing with me or what. Yes, that is what I am
>> telling you, get a system with separate graphic memory which means a
>> separate graphics chip. Many mobos have built in video with *no* video
>> ram.
>>
>
> even so, unless you play 3D games that need massive amount of texture memory
> I doubt it matters much
>
> an I7 have something like +30GByte/sec memory BW depending on memory config
>
> refreshing two full HD monitors at 60Hz is only a few percent of that
Yes, exactly! Just refreshing the monitors is some significant percentage of the available memory bandwidth. Why spend a bunch of money on an i7 with fast memory only to share that with the video controller? Running multicore is typically memory bandwidth limited so a 5 or 10% hit to the memory bandwidth will be a 5 to 10% hit to CPU performance in the critical sections of code... where it matters.

--

Rick
On 11/3/2014 7:37 PM, rickman wrote:
> On 11/3/2014 6:58 PM, Lasse Langwadt Christensen wrote:
>>
>> [...]
>>
>> an I7 have something like +30GByte/sec memory BW depending on memory
>> config
>>
>> refreshing two full HD monitors at 60Hz is only a few percent of that
>
> Yes, exactly! Just refreshing the monitors is some significant
> percentage of the available memory bandwidth. Why spend a bunch of
> money on an i7 with fast memory only to share that with the video
> controller? Running multicore is typically memory bandwidth limited so
> a 5 or 10% hit to the memory bandwidth will be a 5 to 10% hit to CPU
> performance in the critical sections of code... where it matters.
That's also why I asked Joerg if this machine had dual memory channels. That makes a big difference running multicore. I would expect that to show up in the promotional material somewhere.

--

Rick
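For a rough idea of what the channel count is worth: theoretical peak bandwidth scales directly with the number of channels. A sketch using the DDR3-1600 mentioned earlier in the thread (theoretical peak only; sustained numbers are lower):

    # Theoretical peak bandwidth of DDR3-1600: 1600 MT/s x 8 bytes per transfer
    # on a 64-bit channel. Dual channel doubles it.
    transfers_per_sec = 1600e6     # DDR3-1600
    bytes_per_transfer = 8         # 64-bit channel
    for channels in (1, 2):
        peak = transfers_per_sec * bytes_per_transfer * channels
        print(f"{channels} channel(s): {peak / 1e9:.1f} GB/s peak")

Single channel comes out at 12.8 GB/s and dual channel at 25.6 GB/s, which is in the same ballpark as the 30+ GB/s figure Lasse quoted.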
On 11/3/2014 6:49 PM, Phil Hobbs wrote:
> On 11/3/2014 4:31 PM, Joerg wrote:
>> Phil Hobbs wrote:
>>> On 11/02/2014 01:17 PM, Phil Hobbs wrote:
>>>> On 11/2/2014 12:45 PM, John Larkin wrote:
>>>>> On Sun, 02 Nov 2014 11:06:30 -0500, Phil Hobbs
>>>>> <hobbs@electrooptical.net> wrote:
>>>>>
>>>>>> On 11/2/2014 11:00 AM, John Larkin wrote:
>>>>>>> On Sun, 02 Nov 2014 07:25:49 -0800, Joerg
>>>>>>> <news@analogconsultants.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Folks,
>>>>>>>>
>>>>>>>> [...]
>>>>>>>
>>>>>>> I have spent too many hours this weekend tweaking the transient
>>>>>>> response of a semi-hysteretic (we call it "hysterical") switchmode
>>>>>>> constant-current source. There are about 8 interacting knobs to turn.
>>>>>>> At 30 seconds per run, understanding the interactions is impossible.
>>>>>>>
>>>>>>> I want sliders on each of the part values, and I want to see the
>>>>>>> waveforms change as I move the sliders, like they were trimpots on a
>>>>>>> breadboard and I was looking at a scope. I need maybe 500 times the
>>>>>>> compute power that I have now.
>>>>>>>
>>>>>>> Mike should code LT Spice to execute on a high-end video card.
>>>>>>>
>>>>>>
>>>>>> You can go quite a bit faster with a nice multicore machine--LTspice
>>>>>> lets you choose how many threads to run. My desktop machine (about 3
>>>>>> years old now) runs about 150 Gflops peak. Supermicro is an excellent
>>>>>> vendor.
>>>>>>
>>>>>> Cheers
>>>>>>
>>>>>> Phil Hobbs
>>>>>
>>>>> There's a setting for one or two threads. Is that all?
>>>>>
>>>>
>>>> That's because you only have two cores. Mine goes up to 15.
>>>
>>> 16 actually. Here's a picture:
>>> http://electrooptical.net/pictures/LTspice16threads.png
>>>
>>
>> That sounds like a high-testosterone machine of a computer :-)
>>
>> Which processors is in there?
>>
>
> It has a pair of AMD Opteron 6128s. I haven't been keeping up, but 3
> years ago the Magny Cours Opterons ran rings around the Intel offerings
> for floating point.
>
> Cheers
>
> Phil Hobbs
>
The exact specs are:

1 pc  ACC-3C0-3C13685 Supermicro Chassis 733TQ
1 pc  ACC-3C0-3C13685 SUPERMICRO H8DGI-F
2 pcs CFN-OTH-AC170 2 OPTERON COOLING FAN
2 pcs AMD OPTERON 6128 8-CORE (16 TOTAL CORES)
8 pcs MM3-KIN-4G133ER KINGSTON 4GB DDR3 ECC REGISTERED CL9 1.35-1.5V (32gb of ram installed)
4 pcs HDA-WDC-WD1002F WDC RE4 1TB
CDW-LGE-22XSATA 1 pc GOLDSTAR DVDRW 22X GH22NS30

It runs CentOS 6 Linux, with four Windows VMs under Qemu/KVM: two XP and two Win7.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC
Optics, Electro-optics, Photonics, Analog Electronics

160 North State Road #203
Briarcliff Manor NY 10510

hobbs at electrooptical dot net
http://electrooptical.net
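Related to the thread-count exchange quoted above: LTspice can only usefully be given as many threads as the machine has cores, so it helps to know what the OS actually reports. A trivial check (standard library only, nothing LTspice-specific):

    # Print the number of logical CPUs the operating system reports; whatever
    # thread count LTspice is configured for cannot usefully exceed this.
    import os
    print("Logical CPUs reported:", os.cpu_count())

On Phil's dual Opteron 6128 box this would report 16, matching the 16 threads LTspice offers him; on a dual-core machine it reports 2, which is why John Larkin only sees a setting for one or two.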
Phil Hobbs wrote:
> On 11/3/2014 4:31 PM, Joerg wrote:
>> Phil Hobbs wrote:
>>
>> [...]
>>
>> That sounds like a high-testosterone machine of a computer :-)
>>
>> Which processors is in there?
>>
>
> It has a pair of AMD Opteron 6128s. I haven't been keeping up, but 3
> years ago the Magny Cours Opterons ran rings around the Intel offerings
> for floating point.
>
Interesting. I had a similar experience when I bought my mil-spec Durabook many years ago. It has an AMD Turion 64. We sat there at a Cypress seminar and we all built and compiled. When I finished the first one and the blinkenlights began on the board the guys around me could not believe the compile speed. Even the super expensive Thinkpads were still thinking.

--
Regards, Joerg

http://www.analogconsultants.com/