Electronics-Related.com
Forums

Is this Intel i7 machine good for LTSpice?

Started by Joerg November 2, 2014
rickman wrote:
> On 11/2/2014 5:28 PM, Jeff Liebermann wrote:
>> On Sun, 02 Nov 2014 14:56:04 -0500, rickman <gnuarm@gmail.com> wrote:
>>
>>>>> One catch. LTspice saves its preferences to:
>>>>> C:\windows\scad3.ini
>>>>> which has to be writeable. The fix is to use the
>>>>> -ini <path>
>>>>> command line switch, which will:
>>>>> Specify an .ini file to use other than %WINDIR%\scad3.ini
>>>>> <http://ltwiki.org/LTspiceHelp/LTspiceHelp/Command_Line_Switches.htm>
>>
>>> I need to note this somewhere. Writing to the Windows directory is a
>>> *very* bad idea.
>>
>> It was standard procedure in Windoze 3.1, where almost all
>> applications dropped pick_a_name.ini files in the C:\Windows\
>> directory.
>
> Yes, and Windows 3.1 crashed on a regular basis for about any reason
> whatsoever just like 95, 98 and ME.
>
> MS has been telling developers since Win2000 and maybe since NT to not
> put data files in the Windows or Program Files directories. Many chose
> to ignore this which wasn't enforced until Vista and became one of the
> things everyone loves to hate about Vista.
>
Maybe. But for us users only one thing counts: That stuff works.
>
>> I do have to admit it was handy as the files were easy to
>> find and save. The new and improved versions of Windoze hide these
>> config files in either the registry, or bury them 5 directory layers
>> deep, where few can find them without specialized tools or inside
>> information.
>
> Windows doesn't put anything from an app in the registry. That is up to
> the app to decide. Getting to these directories is easy if they used
> the right location, C:\ProgramData. Instead they continue to use
> C:\Program Files and now with Win8 MS puts the files in the long path
> name you list, but I believe they can be reached transparently through
> the path C:\Program Files. So the best of both worlds.
>
> If the app puts them somewhere else, don't blame windows.
>
If it was allowed in old Windows, isn't in new Windows, and there isn't a user selector about this, then I blame Windows.
>
>>> I can't tell you how many developers do all sorts of
>>> things they aren't supposed to under windows. That is the actual cause
>>> of many problems people have running older software under Windows. They
>>> don't listen to the people providing them with the OS!
>>
>> LTspice (aka SwitcherCAD) is a rather old program, with many of the
>> traditions of Windoze 3.1 still present. If you don't like that, try
>> running some of the various NEC antenna modeling programs, that still
>> use the terms "card" and "deck" from the Hollerith punch card era. The
>> common mantra is the same everywhere... if it works, don't touch it.
>
> These programs have been updated many, many times since Windows 3.1.
> Windows NT, 2k, XP, Vista, 7, 8 and 8.1 aren't even the same OS as the
> 3.1 tree which was ended when XP was released. Stick with the old
> habits and blame yourself or your program maintainer.
>
> I use some open source Windows software that does the same crap and I am
> very vocal about the cause and the fix for the problem. Few of the
> developers are interested though. Now that 8 makes this (using Program
> Files for data) work adequately they no longer have a need to change it.
>
> If you are relying on programming habits from over 20 years ago, then
> you will have to stew in your own soup.
>
Easy to say for someone who probably never has to deal with beamfield
sims and such. Bottom line: there are programs some of us have to use
where there is no alternative. Where the design teams dissolved decades
ago and some of the folks are no longer with us on this earth. My
record so far is a chunk of software that was stored on an 8" floppy.

Software does not automatically lose its value because it is over 20
years old. Or would you pour a bottle of 1995 Domaine Leflaive
Montrachet Grand Cru [*] into the sink because it is old?

Talking about using legacy stuff, the aircraft guys are a bit more
extreme there. This aircraft is going to celebrate its 80th soon and is
still used commercially:

https://www.youtube.com/watch?v=jx11k1r1Pm8

[*] It runs north of $5k. Per bottle.

--
Regards, Joerg

http://www.analogconsultants.com/
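[Editor's note: the `-ini` workaround quoted earlier in this subthread is easy to script. A minimal sketch follows; only the `-ini <path>` switch itself comes from the linked LTspice help page, while the executable name `scad3.exe` and the per-user directory are illustrative assumptions.]

```python
from pathlib import Path

# A writable, per-user location for the preferences file
# (the directory name "ltspice" is an illustrative choice).
ini = Path.home() / "ltspice" / "scad3.ini"

# Command line pointing LTspice at that .ini instead of the default
# %WINDIR%\scad3.ini, via the documented -ini switch.
cmd = ["scad3.exe", "-ini", str(ini)]
print(cmd)
```

Launching this (e.g. via `subprocess.run(cmd)`) keeps the Windows directory read-only as far as LTspice is concerned.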
rickman wrote:
> On 11/3/2014 4:28 PM, Joerg wrote:
>> rickman wrote:
>>> On 11/3/2014 3:51 PM, Joerg wrote:
>>>> DecadentLinuxUserNumeroUno wrote:
>>
>> [...]
>>
>>>>> Since I cannot afford to put $1000 into a Titan video card, I
>>>>> miss on a few benchmarks with my $250 GTX650.
>>>>
>>>> I am not at all concerned about video because that's just used for
>>>> static display and sometimes video conferencing. No games, no movies.
>>>
>>> If you are going for power, you need to have separate video memory or
>>> the video eats memory bandwidth which is often the limiting factor on a
>>> multicore machine.
>>>
>>> I haven't kept up with the hotrod machines these days, but I'd be
>>> willing to bet you will get a lot better performance with multi-banked
>>> RAM. Does this machine have two or more memory interfaces or just one?
>>>
>>
>> No clue. But with SPICE the graphics action is very slow, just a wee
>> progress of a few traces on an otherwise static screen. And you could
>> even turn that off.
>
> You aren't grasping the concept. Video memory needs a sizable bandwidth
> to *display* the image to the screen. All the data that goes out over
> your HDMI cable is being read from memory *all the time*. You're a
> bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.
>
> This has *nothing* to do with drawing the images into graphic memory.
>
> The memory bank question will likely be more important than the number
> of cores in the CPU. The guy who can run 16 threads has at least two
> memory interfaces or it would be bogging down between 4 and 8 cores.
>
Well ... we did enter the 21st century. In this day and age, graphics
cards come with their own memory. AFAIK the Nvidia GT720 has 1GB of
onboard RAM. Others have more, but that sounds sufficient. Also, there
is no need to store 60 frames if the content is more or less static.

--
Regards, Joerg

http://www.analogconsultants.com/
On 03/11/2014 20:37, Joerg wrote:

> Ok, but for example a gamer machine like the XPS series is a pretty
> good bet that it'll perform well with SPICE.
If you can persuade them to do it, a gamer machine with the graphics
card entirely deleted will be the cheapest low-power combo to do about
what you want. The 2D capability of the Intel graphics engine built
into the i5 & i7 CPUs is as fast as anything on fancy 3D gaming cards
(obviously they get totally thrashed in 3D realtime rendering tests).

Basically you can shave 100-200W off the power consumption. My i7 PC
idles at about 60W when it isn't doing anything beyond web browsing.

BTW I wouldn't waste your money on exotic faster RAM unless you intend
to overclock it. Stock RAM, and more of it, is better price/performance.

I'd be interested to see how a moderate-sized LTspice simulation scales
with the number of threads on an i5 and i7 architecture. My guess is
that hyperthreading will not be all that useful to it.

--
Regards,

Martin Brown
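[Editor's note: the thread-scaling question above can be probed with a small timing harness. This is only a sketch: `-b` is LTspice's documented batch-mode switch, but `scad3.exe` being on the PATH is an assumption, and since there is no per-run thread-count switch that I know of, the thread setting would have to be changed in LTspice's Control Panel between experiments.]

```python
import subprocess
import time

def batch_cmd(netlist):
    # LTspice batch mode: -b runs the deck and writes a .raw file.
    # "scad3.exe" was the executable name of that era; adjust as needed.
    return ["scad3.exe", "-b", netlist]

def time_batch_run(netlist, runs=3):
    """Best wall-clock time over several batch runs of one netlist.

    Set the thread count in LTspice's Control Panel before calling;
    repeat with different settings and compare the returned times.
    """
    best = float("inf")
    for _ in range(runs):
        t0 = time.perf_counter()
        subprocess.run(batch_cmd(netlist), check=True)
        best = min(best, time.perf_counter() - t0)
    return best
```

Taking the best of several runs reduces noise from disk caching and background tasks, which otherwise swamps the differences between thread settings.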
Martin Brown wrote:
> On 03/11/2014 20:37, Joerg wrote:
>
>> Ok, but for example a gamer machine like the XPS series is a pretty good
>> bet that it'll perform well with SPICE.
>
> If you can persuade them to do it a gamer machine with the graphics card
> entirely deleted will be the cheapest low power combo to do about what
> you want. The 2D capability of the Intel graphics engine internal on the
> i5 & i7 CPUs are as fast as anything on fancy 3D gaming cards.
> (obviously they get totally thrashed in 3D realtime rendering tests)
>
> Basically you can shave 100-200W of the power consumption. My i7 PC
> idles at about 60W when it isn't doing anything beyond web browsing.
>
Problem is, unless you piece together a custom machine, the ones that are equipped with good processors and RAM up to the gills always seem to come with these powerful graphics cards. The other issue is that on-board graphics often will not drive two monitors. I found that out the hard way after I bought the PC I am using now.
> BTW I wouldn't waste your money on exotic faster ram unless you intend
> to overclock it. Stock ram and more of it is a better price performance.
>
So you think 1600MHz RAM is fine?
> I'd be interested to see how a moderate sized LTSPice simulation scales
> with the number of threads on an i5 and i7 architecture. My guess is
> that hyperthreading will not be all that useful to it.
>
I am hoping the four cores will speed things up significantly. Also the
huge amount of RAM. Right now I have 2GB and I regularly hit the limit.

--
Regards, Joerg

http://www.analogconsultants.com/
In article <01a8a0d7-a080-4ff4-ae7d-961154a235b8@googlegroups.com>, 
langwadt@fonz.dk says...
>
> Den mandag den 3. november 2014 22.42.14 UTC+1 skrev rickman:
> > On 11/3/2014 4:28 PM, Joerg wrote:
[...]
> > You aren't grasping the concept. Video memory needs a sizable bandwidth
> > to *display* the image to the screen. All the data that goes out over
> > your HDMI cable is being read from memory *all the time*. You're a
> > bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.
> >
>
> with anything but a graphics-card integrated in the chipset that memory
> will be on the card itself, 1920*1080*24bit is less that 7MB
>
> -Lasse
How do you figure that?

As far as I know, most video cards map a whole 32 bits per pixel for
24-bit (true color) these days.

That would equate to somewhere around 67 megs, I do think.

I don't know, maybe there is some compression magic I don't know
about..

Jamie
On 11/3/2014 5:09 PM, Joerg wrote:
> rickman wrote:
>> On 11/2/2014 5:28 PM, Jeff Liebermann wrote:
[...]
>> MS has been telling developers since Win2000 and maybe since NT to not
>> put data files in the Windows or Program Files directories. Many chose
>> to ignore this which wasn't enforced until Vista and became one of the
>> things everyone loves to hate about Vista.
>>
>
> Maybe. But for us users only one thing counts: That stuff works.
Do you build your stuff so that it craps out if the user connects a different computer? No, you design your interfaces *correctly* so that they work now and keep working when some peripheral piece that should have no impact is changed out. These developers are designing crappy software and blaming it on MS.
>>> I do have to admit it was handy as the files were easy to
>>> find and save. [...]
>>
>> Windows doesn't put anything from an app in the registry. That is up to
>> the app to decide. [...]
>>
>> If the app puts them somewhere else, don't blame windows.
>>
>
> If it was allowed in old Windows, isn't in new Windows, and there isn't
> a user selector about this then I blame Windows.
"Allowed" meaning it didn't crap out, yes. "Allowed" meaning the developers were not designing according to best practices, no.
>>>> I can't tell you how many developers do all sorts of
>>>> things they aren't supposed to under windows. [...]
>>
>> [...]
>>
>> If you are relying on programming habits from over 20 years ago, then
>> you will have to stew in your own soup.
>>
>
> Easy to say for someone who probably never has to deal with beamfield
> sims and such. Bottomline there are programs some of us have to use
> where there is no alternative. Where the design teams have dissolved
> decades ago and some of the folks are not with us on earth anymore. My
> record so far is a chunk of software that was stored on an 8" floppy.
Yeah, exactly. If you are that far back in time you may need to rethink your approach.
> Software does not automatically lose its value because it is over 20
> years old. Or would you pour a bottle of 1995 Domaine Leflaive
> Montrachet Grand Cru [*] into the sink because it is old?
Actually software does degrade with time as you are finding out. If you can't find a platform to run it on, it has worn out.
> Talking about using legacy stuff, the aircraft guys are a bit more
> extreme there. This aircraft is going to celebrate its 80th soon and is
> used commercially:
>
> https://www.youtube.com/watch?v=jx11k1r1Pm8
>
> [*] It runs north of $5k. Per bottle.
Good, maybe your beamfield sim will run on it. :)

--

Rick
On 11/3/2014 4:57 PM, Lasse Langwadt Christensen wrote:
> Den mandag den 3. november 2014 22.42.14 UTC+1 skrev rickman:
>> On 11/3/2014 4:28 PM, Joerg wrote:
[...]
>> You aren't grasping the concept. Video memory needs a sizable bandwidth
>> to *display* the image to the screen. All the data that goes out over
>> your HDMI cable is being read from memory *all the time*. You're a
>> bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.
>>
>
> with anything but a graphics-card integrated in the chipset that memory
> will be on the card itself, 1920*1080*24bit is less that 7MB
It is not the amount of memory, it is the video bandwidth to keep the
monitor refreshed. Yes, it should be separate from the main memory or
you take a hit from the video accesses.

--

Rick
Maynard A. Philbrook Jr. wrote:
> In article <01a8a0d7-a080-4ff4-ae7d-961154a235b8@googlegroups.com>,
> langwadt@fonz.dk says...
>> Den mandag den 3. november 2014 22.42.14 UTC+1 skrev rickman:
[...]
>> with anything but a graphics-card integrated in the chipset that memory
>> will be on the card itself, 1920*1080*24bit is less that 7MB
>>
>> -Lasse
>
> how do you figure that ?
>
> As far as I know, most video cards map a whole 32 bit per pixel for
> 24bit (true color) these days.
>
> That would equate to somewhere around 67 megs I do think.
>
That's megabits. In bytes this would be 8.3MB. Times two if there are two displays. With the massive quantity of onboard memory on modern graphics cards, that is a mere drop in the bucket.
> I don't know, maybe there is some compression magic I don't know
> about..
>
No need for compression.

--
Regards, Joerg

http://www.analogconsultants.com/
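[Editor's note: the numbers traded in this subthread check out, and a few lines of arithmetic reconcile all three claims — Lasse's ~7MB frame, Jamie's "67 megs" (which are megabits), and rickman's refresh-bandwidth worry. The 1920x1080, 32 bits stored per 24-bit pixel, 60 Hz figures come from the posts above.]

```python
W, H, BYTES_PER_PIXEL, HZ = 1920, 1080, 4, 60

frame_bytes = W * H * BYTES_PER_PIXEL     # one 32bpp framebuffer
frame_mbit = frame_bytes * 8 / 1e6        # Jamie's "67 megs" are megabits
frame_mbyte = frame_bytes / 1e6           # Joerg's ~8.3 MB in bytes
refresh_mb_s = frame_bytes * HZ / 1e6     # rickman's refresh bandwidth

print(round(frame_mbit, 1))   # ~66.4 Mbit per frame
print(round(frame_mbyte, 1))  # ~8.3 MB per frame
print(round(refresh_mb_s))    # ~498 MB/s scanned out at 60 Hz
```

So the framebuffer itself is tiny, but the ~0.5 GB/s scan-out is exactly the sustained traffic that a card with dedicated memory keeps off the CPU's memory bus.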
On 11/3/2014 5:47 PM, Joerg wrote:
> rickman wrote:
>> On 11/3/2014 4:28 PM, Joerg wrote:
[...]
>> The memory bank question will likely be more important than the number
>> of cores in the CPU. The guy who can run 16 threads has at least two
>> memory interfaces or it would be bogging down between 4 and 8 cores.
>>
>
> Well ... we did enter the 21st century. In this day and age graphics
> cards come with their own memory. AFAIK the Nvidia GT720 has 1GB of on
> board RAM. Others have more but that sounds sufficient. Also, there is
> no need to store 60 frames if the content is more or less static.
I don't know if you are playing with me or what. Yes, that is what I am telling you: get a system with separate graphics memory, which means a separate graphics chip. Many mobos have built-in video with *no* video RAM.

--

Rick
On 11/3/2014 4:31 PM, Joerg wrote:
> Phil Hobbs wrote:
>> On 11/02/2014 01:17 PM, Phil Hobbs wrote:
>>> On 11/2/2014 12:45 PM, John Larkin wrote:
>>>> On Sun, 02 Nov 2014 11:06:30 -0500, Phil Hobbs
>>>> <hobbs@electrooptical.net> wrote:
>>>>
>>>>> On 11/2/2014 11:00 AM, John Larkin wrote:
>>>>>> On Sun, 02 Nov 2014 07:25:49 -0800, Joerg <news@analogconsultants.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Folks,
>>>>>>>
>>>>>>> Need to spiff up my simulation speeds here. IIRC Mike Engelhardt
>>>>>>> stated that the Intel i7 is a really good processor for LTSPice.
>>>>>>> According to this it looks like the 4790 is the fastest of the bunch:
>>>>>>>
>>>>>>> http://www.intel.com/content/www/us/en/processors/core/core-i7-processor.html
>>>>>>>
>>>>>>> So, what do thee say, is the computer in the Costco link below a good
>>>>>>> deal for LTSpice purposes?
>>>>>>>
>>>>>>> http://www.costco.com/Dell-XPS-8700-Desktop-%7c-Intel-Core-i7-%7c-1GB-Graphics-%7c-Windows-7-Professional.product.100131208.html
>>>>>>>
>>>>>>> It's also available without MS-Office Home & Student 2013 for $100
>>>>>>> less but I found that OpenOffice isn't 100% compatible in the Excel
>>>>>>> area so that sounds like an ok deal. My hope is that it can drive two
>>>>>>> 27" monitors but I guess I can always add in another graphics card if
>>>>>>> not.
>>>>>>>
>>>>>>> Reason I am looking at these is that I absolutely positively do not
>>>>>>> want any computer with Windows 8 in here and unfortunately that's
>>>>>>> what many others come with.
>>>>>>
>>>>>> I have spent too many hours this weekend tweaking the transient
>>>>>> response of a semi-hysteretic (we call it "hysterical") switchmode
>>>>>> constant-current source. There are about 8 interacting knobs to turn.
>>>>>> At 30 seconds per run, understanding the interactions is impossible.
>>>>>>
>>>>>> I want sliders on each of the part values, and I want to see the
>>>>>> waveforms change as I move the sliders, like they were trimpots on a
>>>>>> breadboard and I was looking at a scope. I need maybe 500 times the
>>>>>> compute power that I have now.
>>>>>>
>>>>>> Mike should code LT Spice to execute on a high-end video card.
>>>>>>
>>>>>
>>>>> You can go quite a bit faster with a nice multicore machine--LTspice
>>>>> lets you choose how many threads to run. My desktop machine (about 3
>>>>> years old now) runs about 150 Gflops peak. Supermicro is an excellent
>>>>> vendor.
>>>>>
>>>>> Cheers
>>>>>
>>>>> Phil Hobbs
>>>>
>>>> There's a setting for one or two threads. Is that all?
>>>>
>>> That's because you only have two cores. Mine goes up to 15.
>>
>> 16 actually. Here's a picture:
>> http://electrooptical.net/pictures/LTspice16threads.png
>>
>
> That sounds like a high-testosterone machine of a computer :-)
>
> Which processors is in there?
>
It has a pair of AMD Opteron 6128s. I haven't been keeping up, but 3
years ago the Magny Cours Opterons ran rings around the Intel offerings
for floating point.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC
Optics, Electro-optics, Photonics, Analog Electronics

160 North State Road #203
Briarcliff Manor NY 10510

hobbs at electrooptical dot net
http://electrooptical.net