software engineering

ihateibuypower

Distinguished
Feb 5, 2006
84
0
18,630
Hey guys... I want to get into software engineering, as I'm hoping to study it in college. I'm wondering if you have any suggestions on books or tutorials that could help me get started. Also, I'm seriously considering buying a MacBook Pro and using Parallels to run Windows when I do this stuff. Does anybody know if that works well, or whether another laptop would be a better choice for me? All your help is appreciated.
 
Hey guys... I want to get into software engineering, as I'm hoping to study it in college. I'm wondering if you have any suggestions on books or tutorials that could help me get started.

I learned how to code simply because I had to. Some of my classes required that I do some programming, and I learned pretty much on the fly. I do have a manual, Deitel and Deitel's C How to Program, Fourth Edition, but once I got through the basics, the manual became a reference, as is the GNU website, since I code on a Linux machine.

Also, I'm seriously considering buying a MacBook Pro and using Parallels to run Windows when I do this stuff. Does anybody know if that works well, or whether another laptop would be a better choice for me? All your help is appreciated.

Why in the world would you want to buy a more expensive MacBook Pro, then pay yet more money for a full copy of Windows, only to run it in an emulator at reduced performance? If you want to program on Windows, get yourself a laptop that has Windows already installed. Save yourself a bunch of money.

But depending on what kind of programming you intend to do, you'll probably want access to some sort of UNIXy machine for C, and possibly for non-GUI C++ and Java. That's how the classes at my university worked: the guys who ran Linux, BSD, Solaris, or Mac OS X (with the GNU toolchain installed; it's not there by default) could program on their own computers, rather than having to SSH into a server and either SFTP all of their files up to the server after every save or use a CLI text editor inside the SSH session. Needless to say, many people at least dual-booted one of those OSes if they didn't run it outright and leave Windows behind.
 

chadsxe

Distinguished
Aug 14, 2007
111
0
18,680
I graduate with my degree in software engineering at the end of this month. Clap clap clap. :) Anyway, hands down, everything you need to learn can be learned online. No question. I have about 30 reference books that I have collected over the years and I rarely touch them. The majority of them are poorly formatted for anyone who is just starting out, especially considering the multitude of community forums out there.

I agree with MU_Engineer that there is no reason to get a MacBook. I would 100% recommend a notebook running Windows natively. As much as people liked to beat up on Microsoft Visual Studio (I am talking pre-2003), it has proven to be just about the best IDE on the market. In addition, its C++ compiler is at least as faithful to the ANSI standard as the GNU GCC compiler, and it is hard to beat the backing of the powerhouse known as Microsoft.

It's very important to remember that software engineering is by no means 100% coding. There is a lot of design theory, and there are development approaches to be learned. Also, C++ is by and large the most commonly used language. To me, learning C first is almost counterproductive, but that can be argued to high hell. :)

Regards

Chad
 

wolverinero79

Distinguished
Jul 11, 2001
1,127
0
19,280
OK, if you're going to get into an industry built on knowing how to program and knowing the internals of a computer, why would you buy the Fisher-Price model? Apples are fantastic if you surf the internet, write e-mail, and don't really know what a CPU is. If you play games, program, etc., you'll want a Windows-based machine.

Yes, most colleges will probably run Unix, since colleges are typically ultra-liberal and hate big business (Microsoft, in this case), plus Linux is cheap. However, guess how many businesses use Linux instead of Windows... yeah. If you're learning C++/Java/Oracle on a Linux system, but you're also learning .NET/SQL Server on a Windows system, on your own or through classes, you'll be in high demand after school. If you limit yourself to the *nix world and don't have a Windows-based machine to at least play with, you'll find a lot of doors closed in your face.

Plus, it's college. Unless you're taking a nice TV with your 360 or Wii, you need that Windows system for games :-D
 

FITCamaro

Distinguished
Feb 28, 2006
700
0
18,990
Get one with a widescreen display. And as others have said, you'll likely want a Windows notebook. You won't do anything in college that you'll need OS X for. And most colleges run Windows, not Linux; professors might prefer Linux, but the university itself will likely use Windows. I went to Florida Institute of Technology in Melbourne, FL (a good, little-known school), and they actually switched the CS professors' university machines to Macs (the IT guy was pro-Mac and convinced them).

Many schools might teach you Java to introduce object-oriented programming. Java is good, but the job opportunities are far fewer, so make sure you get a good background in C/C++. Learning .NET helps too, but I doubt many software engineering programs actually teach it, and you shouldn't be learning it when you're first starting to program anyway.

I graduated with a BS in Software Engineering in Dec. 2005. Now I do DOORS Database Administration for an Air Force contractor and write custom tools in DOORS' scripting language to make other people's jobs easier. Finding a niche like this is a good way to make sure you have a job in the future.
 

enewmen

Distinguished
Mar 6, 2005
2,249
5
19,815
I have a master's in software engineering and I work for a private money lender.
Know this well enough to do it in your sleep:
(OO) C++, interfaces (not the GUI type yet), templates (generics in C#), the full software development life cycle, verification and validation, CASE tools, database queries both in the database environment and in the programming environment, multi-threaded programming, and delegates.

Then, after you master all that, you can specialize in .NET, Linux, etc. Get certifications (MCSD, etc.). Schools don't teach anything like the real world (I did everything in a UNIX shell), so you need lots of experience anyway.

Also, FINISH a four-year university degree; don't shoot yourself in the foot. Some good programmers didn't finish college, but their potential is greatly reduced and they are probably paid less (since companies can get away with it).

Just about any Windows notebook is good enough, since they all have LAN, modem, USB, etc. connections. Mac users typically need special applications for graphics or music.
 

enewmen

Distinguished
Mar 6, 2005
2,249
5
19,815

Yeah, most colleges do run UNIX, but I think it has more to do with the fact that UNIX and UNIX apps change very slowly. It would be hard for professors to re-learn everything every few years, as with Microsoft programming. UNIX is also a VERY good multi-user OS: it was multi-user, multi-tasking, and multi-processing 20 years before Microsoft. I got lots of core dumps programming C++ with dangling pointers, and the UNIX terminal window didn't even blink. I also agree colleges hate Microsoft and other big business, but I wouldn't call that ultra-liberal (since UNIX and universities were doing advanced work decades before Microsoft); I'd call it conservative. Companies still use Microsoft, and you have to know Visual Studio unless you work for some large defense company that avoids Microsoft. Also, like others said, don't expect a lot of games on the Mac.
Linux is used a lot in web work; around 50% of the world's web servers run Linux. So if you (software engineers) are doing web applications (not just web pages), learn Apache and IIS. You need to be good with databases anyway.
 

moocow

Distinguished
Apr 7, 2004
66
0
18,630
I think it's much easier to learn from resources online than out of a book. There are so many tutorials and so much sample code available online; using a search engine to navigate is much easier than thumbing through a book, and you can cut and paste code.

I'd start off learning C++ and programming from a UNIX-like command prompt, rather than some OS-specific programming environment. This is probably where you'd start in school. Use g++ (GNU C++) as your compiler and a text editor such as Emacs. You can run these from a command prompt on Linux or Mac, or by downloading Cygwin for Windows. Start off by following the tutorials at a page like http://www.cprogramming.com/tutorial.html#c++tutorial

Then, once you're comfortable with the basics of the language and of programming in general, start trying to make a real program. Do something that's fun or interesting to you. Things like object-oriented design, modularity, verification and validation, databases, etc. might be practical for a professional software developer, but they are dreadfully boring. So don't worry about those types of things starting out; just try to do something fun so you can keep motivated. Gaining general experience with programming, analytical thinking, and troubleshooting is the most important thing.

Start off by taking sample code similar to what you want to do from online, then compile and run it. Then start trying to modify it or combine it with other sample code online. For example, if you are interested in making games, you could stay cross-platform and learn OpenGL from a site like http://nehe.gamedev.net/
 

chadsxe

Distinguished
Aug 14, 2007
111
0
18,680



I agree 100% about learning from online resources. What I have found school to be good for is that it tells me what I don't know. Once I figure that out, the internet has all the information I need to fix the problem.

In regards to starting out on g++ from Emacs: ehhh, I am not sure I agree. There really is no benefit for a newcomer. To me it actually makes things harder, because you don't have the assets available in a good IDE, especially something like Visual Studio and the debugging environment it offers. At one point (pre-2003) I would have said yes, because of how much stricter g++ was than VC++'s compiler, but that is just not true anymore.


 
Yeah, most colleges do run UNIX, but I think it has more to do with the fact that UNIX and UNIX apps change very slowly.

I think that's more of an issue with large companies that used to run, or still run, a lot of big-iron servers, especially IBM units. The rest of the legacy stuff is Win32/DOS rather than UNIX.

It would be hard for professors to re-learn everything every few years, as with Microsoft programming.

The GUI parts change frequently, but the non-GUI parts don't. You can use GCC just as easily as Microsoft's compilers to compile the code, and it's much cheaper to have a few Linux/BSD/Solaris machines around that are supported by the existing IT staff than to buy a bunch of MS licenses. This is much more of an issue at colleges that don't have a university-wide, all-of-our-products site license.

UNIX is also a VERY good multi-user OS: it was multi-user, multi-tasking, and multi-processing 20 years before Microsoft. I got lots of core dumps programming C++ with dangling pointers, and the UNIX terminal window didn't even blink.

This is another reason it's popular. I personally much prefer working with UNIX-type systems, because they tell you much more about what's going on and handle sloppy programming better than Windows does. I have also dealt with forgetting to free pointers, and my Linux machine handles it way better than the Windows machines.

I also agree colleges hate Microsoft and other big business

I don't. My university, and several others in the state whose positions I know, *love* Microsoft and don't like the free UNIXes very much outside of the web server and network traffic handling roles. There are a handful of Linux machines around campus and one server that handles all of the CS SSH traffic as well as SAS/SPSS and file serving. Almost everything else is Windows: Exchange, IIS, etc. There are a fair number of Macintosh computers around, but most of those are x86 units that dual-boot Windows. They only exist because of the journalism school, and for some reason the biology department is all Macintoshes.

...but I wouldn't call that ultra-liberal (since UNIX and universities were doing advanced work decades before Microsoft); I'd call it conservative.

I would agree that universities that deploy mostly UNIX and UNIXy machines are conservative: financially conservative. MS licenses are not cheap, and you always run the risk of getting audited by the BSA if some disgruntled employee wants "to get even." Even if you're 100% legit (and with thousands of computers, you'll likely have at least a few that aren't), it's still a pain in the butt and expensive in terms of lost productivity. You are also beholden to their license agreements and any increases in fees that they deem you should pay. The open-source UNIXes do away with almost all of that. There are also no antivirus subscription fees, and the UNIX OSes are generally thought to be more reliable, reducing the number of IT staff you have to pay to administer and fix the machines, but that's just icing on the cake. So I'd consider the liberal choice to be running the more expensive software that carries the greater liability, and that's MS, not the UNIXes.

Companies still use Microsoft, and you have to know Visual Studio unless you work for some large defense company that avoids Microsoft.

It depends on what you want to do. There are a lot of UNIX machines out there, mostly as servers, and *somebody* has to develop applications for them and administer them. Also, it's becoming increasingly common to encounter non-Windows machines in businesses for the reasons I outlined above, and again, somebody will need to develop for them. The bottom line is that it's good to know how to work with several kinds of machines, as it makes you more versatile and thus more attractive.

Also, like others said, don't expect a lot of games on the Mac.

That's completely irrelevant to the argument about universities. I haven't seen any games on any computers owned by any university that are more complex than Solitaire or Pinball, so the argument is moot. This might be more important to a student, but if you want to game, consoles are less expensive in the long haul than computers.

Linux is used a lot in web work; around 50% of the world's web servers run Linux. So if you (software engineers) are doing web applications (not just web pages), learn Apache and IIS. You need to be good with databases anyway.

Linux is also *the* HPC OS, as well as being extensively used for databases (especially large ones), file serving and e-mail. Windows servers tend to be used more frequently to run MS-only programs like Terminal Services or Exchange rather than the uses mentioned before, although IIS is used extensively for corporate intranet pages. Generally Windows servers will be used for file and Web serving if there is really only need for one server and you have to run Exchange or Terminal Services, so it's more of a "hey, we have this server already" kind of bit than a choice to use it. Of course there are exceptions, but this is what I've seen.
 

Redline24

Distinguished
Apr 19, 2006
107
0
18,680
I'm in the same situation. I'm waiting for the MacBook (non-Pro) refresh, which should let me drop 4 GB of memory and a nice 250 GB hard drive into the laptop without a problem, plus slightly better integrated video offering SM 3.0. (I'm not a gamer; I just want something for OpenGL or another library, if needed.)

The build quality of MacBooks is much better than any laptop I've seen out there: LED backlighting, the longest battery life on WiFi (802.11n), quiet, next to no failures for most people, and the magnetic power connector. Although Dell is significantly cheaper, the build quality of their laptops isn't as good.

Thirdly, I like how small and light the MacBook is. All I need is a small screen (12") with a decent resolution. No dedicated video or large (15"+) screens for me.

Why can't one just use Boot Camp with Vista/Linux?
 

bliq

Distinguished
These are great responses. The only thing I would add is that, in my experience, nothing has replaced the classroom and books for learning the basic concepts of programming (conditionals, loops, etc.), which you can then extend with online resources to learn the actual syntax. I think it's important to use a well-thought-out text to learn all the intricacies of these constructs before applying them to real programs. That's why I always tell people to start with something like QBasic to learn the basic structures; they can then use that as a jumping-off point to C or Perl or whatever.

I learned the basics when I was 8 or 9, writing BASIC on a Commodore PET and a Commodore 128.

From there, it was easy to just learn the syntax to write JavaScript, VB, PHP, Perl, etc.
 

wolverinero79

Distinguished
Jul 11, 2001
1,127
0
19,280
I agree with the idea of learning the basics first (i.e. non-graphical C++) and then expanding into other realms once you have that under your belt. I was somewhat surprised to hear that some colleges do little C++ and go straight to Java, as I feel you learn more about the underlying structure of the system by doing C++ than by using Java.
 

russki

Distinguished
Feb 1, 2006
548
0
18,980
I would echo what others have said about learning C++, or maybe even C (easier, even more fundamental, and stricter in syntax). I am a sucker for Stroustrup (the author, and also the creator of C++), but many think his book is a bad starting point: too advanced to be the first read. That may be the case; when I picked it up, I was already comfortable with C...

BASIC is the easiest to learn, but I would advise against it for various reasons, such as its loose syntax. It is best to learn strict syntax first and then move on.

I would also suggest an IDE: much more readable code organization and superior debugging tools (and you'll need them at first, because of all the logic errors, etc.). Edit: I am not suggesting using an extensive GUI when learning, but rather using the IDE's tools while writing basic non-GUI code.

Also, try to find a good book on algorithms and data structures. They are the basis of good programming. I find online resources to be a good reference tool but a poor learning tool, personally.
 

Curious1

Distinguished
Jun 6, 2005
33
0
18,530
Some people on this forum really need to stop with the infantile Mac bashing, especially since they don't know what they are talking about in the least. FUD corrections are as follows:

“Why in the world would you want to buy a more-expensive MacBook Pro to just pay yet more money to buy a full copy of Windows to run it in emulator at reduced performance?”

Boot Camp is not an emulator, and Windows runs at native speeds on Apple hardware. There are even reports claiming that Windows RUNS FASTER on Apple hardware. Seeing as Apple has totally embraced PC hardware, from Intel CPUs to ATI and nVidia GPUs to USB 2.0 to EFI (which was developed by Intel), Macs are as much PCs as... well... PCs.

“…why would you buy the fischer price model? Apples are fantastic - if you surf the internet, write e-mail, and don't really know what a CPU is. If you play games, program, etc., you'll want a Windows based machine.”

Funny, I remember when Windows XP was called the Fisher-Price model! I repeat: Macs use the SAME DAMN INTEL CPUs! Moreover, Macs run Windows natively, which means GAMES! Jesus, how many times are people going to repeat this FUD?

Lastly, Macs can run all three major operating systems: Windows XP (native and virtualized), Linux (native and virtualized), and of course OS X. Moreover, Leopard is fully UNIX compliant. Certification means that software following the Single UNIX Specification can be compiled and run on Leopard without any code modification.

The only argument against getting a Mac is that they are usually more expensive than your basic Dell. NOT insanely more expensive, but more expensive nevertheless. With Dell's coupons you can pay at least $200 less than for a comparably feature-laden Mac. But of course you get what you pay for, and Macs definitely have better build design than a Dell. And this is coming from a guy with two Dells who has used, installed, and programmed on Windows since DOS.



Relevant links:

http://www.apple.com/macosx/leopard/technology/unix.html

http://www.apple.com/macosx/bootcamp/
 
Some people on this forum really need to stop with the infantile Mac bashing especially since they don’t know what they are talking about in the least.

LOL, "infantile Mac bashing." Since that was my message you quoted below, I'll let you know that I am *not* an "infantile Mac basher." I'm not an absolute expert on OS X, since I don't use it on a daily basis, but I've certainly used it enough to get a decent feel for it, and I'm skilled enough to help regular Macintosh users set things up or fix them.

Boot Camp is not an emulator and Windows runs at native speeds on Apple hardware. There are some reports that claim that Windows RUNS FASTER on Apple hardware. Seeing as Apple has totally embraced PC hardware from their Intel CPUs to their ATI and nVidia GPUs to USB 2.0 to EFI that was developed by Intel, Macs are as much PCs as … well … PCs.

First off, most Macintoshes are personal computers and always have been- the Xserves and the old-school PPC and 68k beige-box servers being the exception. So drop the "Mac vs. PC" bit or we'll have to hit you with a clue-by-four until you do.

Secondly, I didn't say anywhere that Boot Camp was an emulator. It is in fact a BIOS emulator, but that makes virtually no difference to the OS once it is running, as the OS assumes control over almost all of the hardware. Things like VMware, QEMU, and Parallels are emulators, but that wasn't what was being discussed. What was being discussed is installing Windows on the bare metal of an x86 Macintosh (using the Boot Camp BIOS emulator). It achieves the same effect as any other Windows laptop, but at a generally higher price, and it requires that Boot Camp be installed. That's what I said was a little goofy: it pretty much defeats the purpose of buying the Macintosh in the first place, which is the OS. The only other difference between a Macintosh and any other computer is the hardware, and you just said that it's the same as any other computer's. So there isn't really a logical reason to spend more money on a Macintosh. If you get a Macintosh just to install Windows on it, you probably did so for some other reason: to brag to your friends, because you like how Macintoshes look, etc.

“…why would you buy the fischer price model? Apples are fantastic - if you surf the internet, write e-mail, and don't really know what a CPU is. If you play games, program, etc., you'll want a Windows based machine.”

Funny, I remember when Windows XP was called the Fischer Price model! I repeat again, Macs use the SAME DAMN INTEL CPUs! Moreover, Macs run Windows natively which means GAMES! Jesus, how many times are people going to repeat this FUD?

The previous quote was a little dumb, but your response simply reaffirms what I said above: if you want a Windows-based computer, it's easier and cheaper to just buy one set up that way.

Lastly Macs can run all three major operating systems from Windows XP (native and virtualized), Linux (native and virtualized) and of course OS X. Moreover, Leopard is fully UNIX compliant. Certification means that software following the Single UNIX Specification can be compiled and run on Leopard without the need for any code modification.

Technically, an unmodified x86 Macintosh can only run Linux, BSD UNIX, and Mac OS X, as no other OS has EFI support. But the same OSes will run on a PowerPC Macintosh too, so there's no big deal there. You can hack the EFI to allow OSes without EFI support, like Windows XP and DOS, to load, and that is the big deal with these machines. But you can run Windows, Linux, and OS X on any other x86 computer with SSE3 support. There's no technical limitation, only that Apple DRM'ed some of the core components to prevent it. It's not an "it won't run on any other machine" situation; it's a "they won't let me run it on any other machine, but it will in fact run" situation. The OSx86 guys figured that out pretty quickly.

Yes, MacOS X is UNIX compliant because it is a BSD UNIX distribution with a proprietary graphical subsystem in place of (and sometimes alongside) X11 and a proprietary DE. You can compile UNIX applications for a Macintosh without any code modification if and only if you can gather and install ALL linked-to libraries on your system. I can send you the source code for another UNIX's program, such as GNU Octave, and you won't be able to compile and run it until you install X11, GTK, and a whole host of other libraries. It is possible, but it can be difficult to the point that an application is effectively non-portable.

Oh, and one more thing: the above is referring only to programs where you can get all of the needed libraries. This means that a GUI program you write on your Macintosh might technically be UNIX compliant but be impossible to run on my Linux machine as I cannot get a hold of the Quartz or Cocoa libraries that I'd need to compile and run the program.

The only argument against getting a Mac is that they are usually more expensive than your basic Dell. NOT insanely more expensive but more expensive nevertheless. With Dell’s coupons you can pay at least $200 dollars less than for a comparably feature laden Mac. But of course, you do get what you pay for and Macs definitely have a better build design than a Dell. And this is coming from a guy with two Dells who has used, installed and programmed on Windows since DOS.

Macs are more expensive; I've been saying that all along. But the build quality bit is very subjective, and of the dozens of people I know who have owned Macintoshes, especially MacBooks, plenty have had problems, and the build quality has been poor. The original Core Duo MacBooks ran far hotter than my old flamethrower P4-M laptop did. MacBooks invariably look terrible after several months, once the keyboard rest either turns skid-mark brown (white models) or the black paint flakes off (black models). The metal-case MacBook Pros wear better, but they are *much* more expensive than the basic Dells and HPs and compete with corporate-class machines. I've seen Dell Latitudes basically thrown around for a couple of years and not look the worse for wear; ditto many ThinkPads (except some X60s, whose screen bezels are crap) and HP/Compaq business machines. I'd say the others are as good as or better than the Macs from what I've seen, and I've seen hundreds of notebooks over the last several years.

Plus, Apple has a very narrow range of notebooks with a very limited feature range. They have a 13.3" consumer unit and 15" and 17" corporate units. Most other OEMs have 12.1" and 14.1" units in addition to the sizes Apple offers, and a few have 11" and smaller units and 19-20" behemoths. Many also have tablet PCs, which have some traction in certain fields such as education and medicine. Apple's notebooks have very limited hardware features, too. None of them have a straight HD-15 RGB port for projectors: the Pro models have DVI-D (you need a DVI-D to RGB adapter), and the consumer models have a proprietary port and require a dongle that most people forget when they need to hook up to a projector. No Apple notebook has a memory card reader, something that's pretty handy and almost ubiquitous on other notebooks. The notebooks also can't take an extended battery, as the battery sits directly underneath the middle of the unit, so battery life is limited. Other notebooks accept a wide range of battery sizes, and some really take advantage of the ability to poke out of the back or front of the unit. These traits commonly make Apple notebooks less desirable for the money once they're compared with comparably priced units from other OEMs.
 

ches111

Distinguished
Feb 4, 2006
1,958
0
19,780
Here is the deal...

Knowing how to write code in C++ is only a small part of the battle.

Knowing what effects your code has on the system is another large part of the battle, one that is often forgotten these days.

Get yourself a UNIX internals book and have a good read. From there, install any distro of Linux on your newly acquired Windows machine and go to town.

Why Linux? Because Windows developers tend to make less money than those who work with and know Unix variants.

Things you learn in the Linux dev world will directly apply to your Windows dev world (.NET aside). Then learn Java. You can even use Eclipse as a graphical development environment on both Windows AND Linux.

Most devs these days have little to no understanding of what is happening under the covers: how to optimize code, what word alignment is, and why it is important on certain machines and not so important on others.

Many devs probably do not know what an inode is anymore. They don't know why they should or should NOT inline something.

And many certainly do NOT know how to efficiently design code, let alone a parallel application.
 

moocow

Distinguished
Apr 7, 2004
66
0
18,630
Well, a lot of the responses in this thread were senseless Mac bashing. I've never owned a Mac myself, but buying a MacBook and running Windows as a virtualized OS, as the original poster mentioned, is a perfectly reasonable thing to do if you have the money. I know a lot of software engineers, computer science students, and physicists who do a lot of coding and own Mac laptops. Usually the reason is that they are used to Linux, but some things in Linux, like power management, often don't work very well on laptop hardware. Since OS X is UNIX-based, it offers many of the tools they want.

What it essentially means is that you are paying money for software: the OS X GUI and its core programs, the command prompt and UNIX functionality, etc. These are the things you will use most of the time, and if there is a Windows program you simply have to run, you can run it in a virtual OS. That's a perfectly valid thing to do if you are running native OS X apps more often than Windows apps. I don't see how people can be so opposed to paying more to be able to run both OS X and Windows, and then suggest that this poster go spend at least $250 on Microsoft Visual Studio.
 

russki

Distinguished
Feb 1, 2006
548
0
18,980


I would argue that you need to crawl before you can fly. I agree with your larger point, but first it would be good to understand the language, and then go and understand Unix. Unix is not trivial.

Ultimately, of course, you are right: to be an efficient programmer you really need to know the system you are programming for and on. That holds true regardless of the platform; knowing how Windows works is required to write effective Windows code. Yes, yes, C and UNIX (and I believe originally it was C and not C++, but do correct me if I'm wrong) go hand in hand, but I would suggest that things need to be taken in order, and the language is probably step 1. You probably still go back to the language.

And another thought: knowing the system is one of the more advanced parts of programming and thus should be one of the later considerations. If the program logic is poor, no matter how well you know the system, it will remain a poor program. So I would say a good program comes from an effective combination of logic, understanding of the platform, and good coding practices.
 

chadsxe

Distinguished
Aug 14, 2007
111
0
18,680


So do you care to give some reasoning for your statement? You might not prefer Visual Studio, but it is kind of hard to deny one of the best IDEs on the market as an option altogether.
 

chadsxe

Distinguished
Aug 14, 2007
111
0
18,680



I would also suggest sticking with one language for a while instead of jumping around. A lot of colleges like to keep throwing new languages at you without ever letting you get your feet wet with just one. There is soooo much to be learned in regards to C++. I'm not saying you have to learn it all, but it is a benefit to at least become well versed. Once you get your head around one language, the majority of the ideas will translate to most other languages. The reason I recommend C++ over others is that it encompasses just about every idea out there.
 

chadsxe

Distinguished
Aug 14, 2007
111
0
18,680



It would seem that way....

I think it has to do with the fact that people are often put into a "specialized" role in the work world. I can't count the number of people I have come across who carry the "developer" title but could not code some basic data structures or, like you said, optimize based on the platform. Which is all fine and dandy with me, because it makes me more money in the long run.

Bottom line: anyone who is thinking of entering the wonderful world of software development, in any of its multiple forms, needs to realize that there is a hell of a lot of dedication to learning to be had. There is flat out no way one person could ever learn everything there is to learn. Find a niche and roll with it. But by all means don't limit yourself to the high-level explanation of things. Dig deep and search for those low-level answers.
 

hassa

Distinguished
Mar 5, 2007
140
0
18,680


Forget all the Unix vs. Microsoft stuff; it doesn't make any difference. Here's what you need to do:

1) Pick a language to use for learning. Personally, I would stay away from C initially (too many issues with pointers, etc.). I learnt on Pascal in the mid-to-late 90s; however, Java and C# are solid modern languages.

2) Do simple command line programming - e.g. Hello World.

3) Learn how to declare variables etc...

4) Start learning simple constructs such as if statements and loops. Write simple apps, like converting temperatures from C to F and F to C, or distances from km to miles and miles to km.

5) Move on to more complicated structures such as case statements, arrays, and so on. Then start looking at doing simple things like file access.

Once you have learnt all of these, you will have a solid grounding. This is where I was after first year at uni.

Then you start learning things like OO programming, GUIs, database programming (SQL) and so on.

A Uni degree is normally structured to teach you in steps - you end up taking lots of little steps.

Then you go and get a job and learn all sorts of things they don't teach you at Uni. It will not be a 5 minute process.
 

russki

Distinguished
Feb 1, 2006
548
0
18,980

That's an interesting debate. I would still suggest starting with C. Yeah, pointers are not an easy thing to get a handle on, but that is also more or less a language-independent issue; Pascal has pointers too, for example. And that is a nice, structured language; too bad it's dead. C, on the other hand, is the foundation of most modern languages; one could even argue that modern Basic is the evolution of traditional BASIC into something more like C (well, maybe Pascal, in that it differentiates between functions and procedures, if I recall correctly).

Anyway, C is structured enough to force somebody into strict syntax (C++ to a lesser extent), and you can stay away from pointers and memory management initially, which is how most books are written anyway. And one can never hide from complicated issues forever...
 
