(yet another) Which version of Linux for me?

Page 2 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.

bmouring

Distinguished
May 6, 2006
1,215
0
19,360
No no, I just know that some people don't really care to know too much about this kind of stuff (they want it to "Just Work TM").

Basically, as the filesystem portion of the kernel is working with files, quite often all of this happens in system memory and not on the actual disk (as you probably know). When this data does get flushed to the disk, if some of the structural data of the filesystem is only partially written to the on-disk structures, it can lead to broken structures (corrupted data, bad pointers, you get the idea). The idea of a journal is to first write down an atomic group of actions to dispatch before touching the on-disk structures (or doing a cache flush).

The order follows as such:
1) Write the actions to a journal
2) Upon completion of the journal dispatch unit, mark it as completely written to the journal
3) Begin modifying the actual on-disk structures
4) Upon completing, mark the dispatch unit in the journal as completely written to disk
5) Begin loading new unit to flush.

If the system crashes before 1), all data is lost and the journal record isn't valid (since there's nothing in it). XFS caches a lot of data, which is why so much can be lost in the event of a crash. A little data loss, but life is good.

If it crashes between 1) and 2), there are actions in the journal but it hasn't been marked as "good", so replaying it could lead to an inconsistent system (think of first moving a file chunk and then updating the pointer to it; if it crashes between those steps you are now missing data), and as such, the journal is ignored. Life is good.

If a crash happens between 2) and 3), we have a good journal but nothing's taken place on the actual disk. On restart, the actions listed in the journal are replayed and the journal is marked "completely written to disk". Life is good.

If a crash occurs between 3) and 4), we have only partially updated the on-disk structures. As such, like above, the entire journal is replayed. Some file data might be lost, but the filesystem is coherent. Life is again good.

If a crash occurs between 4) and 5), on next boot the filesystem portion of the kernel sees that the last transaction was complete, even if the filesystem itself is marked "dirty" (when unmounting a filesystem, it is marked "clean" to let the kernel know that a check is not necessary). The replay is skipped (as there is nothing to replay) and life is good.
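The recovery decisions above boil down to checking how far the last transaction got. Here's a toy sketch of that logic; the state names are illustrative, not any real filesystem's on-disk format:

```python
# Toy model of journal recovery after a crash. States mirror where in
# steps 1-5 the crash happened (names are made up for this sketch).
EMPTY, PARTIAL_JOURNAL, JOURNALED, REPLAYING, COMMITTED = range(5)

def recover(state):
    """Decide what to do with the journal on the next mount."""
    if state in (EMPTY, PARTIAL_JOURNAL):
        # Crash before 2): the record was never marked "good", so it
        # cannot be trusted and is simply discarded.
        return "ignore journal"
    if state in (JOURNALED, REPLAYING):
        # Crash between 2) and 4): the journal is complete, so replay
        # it in full; replaying is idempotent, so a half-applied unit
        # is safe to apply again.
        return "replay journal"
    # Crash after 4): the on-disk structures already match the journal.
    return "skip replay"

for s in range(5):
    print(recover(s))
```

The key property is that every state maps to a safe action, which is why "life is good" in all five cases.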

Now, some really cool stuff is the so-called log-structured filesystems; they use this same concept but with the entire filesystem instead of just the metadata. Updates to file data result in writing the data anew, and the log keeps track of the location of the file contents as they move around. As you can probably imagine, since old data isn't initially overwritten (it is later, by a "garbage collector" of sorts), it allows for trivial implementation of filesystem-wide versioning.
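The append-only idea can be shown in a few lines; this is a deliberately naive sketch (no garbage collector, everything in one list), just to show why old versions stay readable for free:

```python
# Minimal sketch of the log-structured idea: writes append new copies
# instead of overwriting in place, so every old version remains in the
# log until a garbage collector reclaims it. Purely illustrative.
class LogFS:
    def __init__(self):
        self.log = []  # append-only log of (name, data) records

    def write(self, name, data):
        self.log.append((name, data))  # never overwrite in place

    def read(self, name, version=-1):
        versions = [d for n, d in self.log if n == name]
        return versions[version]  # -1 = latest, 0 = oldest

fs = LogFS()
fs.write("notes.txt", "v1")
fs.write("notes.txt", "v2")
print(fs.read("notes.txt"))     # latest copy: v2
print(fs.read("notes.txt", 0))  # first version still in the log: v1
```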

As boring as it might seem to some, as you could probably tell, I think filesystems are pretty damn cool.
 

zyberwoof

Distinguished
Apr 6, 2006
135
0
18,680
[code:1:8a44ba4da6]//$ = User #=root
$ xhost +
//reply about allowing access to the X server
$ su
//Enter the root password
# qtparted
//should open a window with qtparted in it. Go from there.[/code:1:8a44ba4da6]

Also note that the parted family of apps is for making partitions, not formatting the disks. In order to do that, use the various mkfs commands.

[code:1:8a44ba4da6]man mkfs[/code:1:8a44ba4da6]


Alright, right now I am trying this. I switched users so that I am root by typing in "sudo su -" and then giving the password. However, every time I type in xhost or xhost + it says "unable to open display." Then if I try typing in qtparted it says "cannot connect to X server." I am assuming it says that since running xhost gives me the error it does.



*** UPDATE ***
After a bit of work I got GParted installed and working, so now my partitioning and formatting is underway. I am still curious why the xhost thing was not working. Oh well, progress is being made!



*** UPDATE 2 ***
I was having trouble mounting all of my hard drives, but I found a great guide that walked me through doing it through the command line, so I feel that much wiser now.

A new problem I am facing is how to get my Windows computers on my network to be able to see the drives I have shared on my xubuntu pc. I ran the installed "shared folders" program and added a few folders, but I am not able to even see my linux pc from the Windows PCs when I go through "My Network Places."



Current Questions:
1. What is the deal with xhost? Why is it that when I do sudo su - and then xhost + it is giving me that error? I am assuming that xhost is something I will need for many Linux applications.

2. How do I set up my Linux PCs so that they are visible from my Windows PCs?
 

bmouring

Distinguished
May 6, 2006
1,215
0
19,360
Sorry, shoulda been clearer here: you run xhost before changing to root, allowing root to connect to your X server, not after.
[code:1:ab4e252826]brad@the-uberbeast ~ $ xhost +
access control disabled, clients can connect from any host
brad@the-uberbeast ~ $ su
Password:
the-uberbeast brad # gnomine
//Gnomine opens[/code:1:ab4e252826]

Also, I rarely use this trick, as most of my root work is text-based.

As for installing and setting up the samba server stuff, here you can find a great, (x)ubuntu-specific guide for doing just that. If you have additional questions, of course don't hesitate to ask.

Cheers.
 

zyberwoof

Distinguished
Apr 6, 2006
135
0
18,680
Ah, I see. As time goes on I am going to try to work more and more with the command line and less with the GUI for my administrative work. But for now, I'm stuck with GUIs since they are more user-friendly.

This site is one someone recommended in the forums and it seems to be really good for those just beginning with Ubuntu. It has step-by-step guides that walk you through some simple (but important) tasks, all from the terminal. Also, there are not too many guides on the site, just the ones most beginners need. That makes it much easier to use the site and find what you need. This site may be a good one for y'all to recommend to all of those beginning Ubuntu users.

The file system information was very interesting in my opinion. It was more confusing than it should have been since I read it at the end of the day and my mind was getting a little slow. Either way, it was a good read, and every little thing like that may help me in the future once I finish school. I had not given the file systems computers use much thought before (probably since with Windows it was simple: use NTFS), but I find them, along with all other data structures, interesting.

Thanks for the help! I may be back for more help later.
 

linux_0

Splendid
bmouring nailed it! :-D


Of course you could also go to the source ;-)

http://us1.samba.org/samba/

http://us1.samba.org/samba/docs/man/Samba-HOWTO-Collection/

http://us1.samba.org/samba/docs/man/Samba-Guide/

Be sure to check out rsync; it's a life saver!

http://samba.org/rsync/

:-D
 

zyberwoof

Distinguished
Apr 6, 2006
135
0
18,680
rsync looks perfect. The files I am backing up are mostly video files, probably around 350 - 400 GB right now and going up. Not only is the automation what I want, but it would definitely be best if the backup was incremental and didn't just replace every file (that would take a LONG time, especially since the PC with the movies on it may be connected wirelessly).
 

bmouring

Distinguished
May 6, 2006
1,215
0
19,360
Yep, when in school I used it to keep my desktop (where I do most of my work) and laptop (where I take all of my notes and do some work) in sync. Works like a charm.
 

linux_0

Splendid
Yup yup :-D

[code:1:dc0e812b0f]
rsync -artlpzv -e ssh /home/$your_username "$your_username@192.168.0.20:/home/$your_username/"

# /home/$your_username is the source directory on the local machine

# "$your_username@192.168.0.20:/home/$your_username/" is the remote machine and path in SCP syntax
[/code:1:dc0e812b0f]


PS Remove the z from -artlpzv if transferring already compressed files on a LAN.
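The reason rsync suits an incremental backup like the one described above is its delta transfer: only blocks whose checksums differ get sent. The real rsync uses a rolling weak checksum plus a strong hash and can match blocks at any offset; this fixed-offset sketch just shows the core idea:

```python
# Rough sketch of delta transfer: compare files block by block and
# only "send" blocks whose checksums differ. Fixed-size, fixed-offset
# blocks here; real rsync is considerably smarter.
import hashlib

BLOCK = 4  # tiny block size for the demo; real block sizes are much larger

def blocks(data):
    return [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]

def blocks_to_send(old, new):
    """Return indices of blocks in `new` absent from `old` at that offset."""
    old_sums = [hashlib.md5(b).hexdigest() for b in blocks(old)]
    send = []
    for i, b in enumerate(blocks(new)):
        if i >= len(old_sums) or hashlib.md5(b).hexdigest() != old_sums[i]:
            send.append(i)
    return send

old = b"aaaabbbbccccdddd"
new = b"aaaaXXXXccccdddd"  # one block changed
print(blocks_to_send(old, new))  # only block 1 goes over the wire
```

For a mostly unchanged 400 GB collection, that is the difference between shipping gigabytes and shipping kilobytes over the wireless link.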

Wireless is convenient but it is super slow ( 54 Mbps ) compared to GigE ( 1000 Mbps ), and not very secure even with encryption turned on.

54Mbps is max theoretical -- your mileage will vary but if there is interference or other issues it may slow down to less than 11Mbps.

1000 Mbps is max theoretical -- you can usually push about 250 Mbps ( ~30 MB/sec, depending on your hardware, software and settings ).
 

zyberwoof

Distinguished
Apr 6, 2006
135
0
18,680
Trust me, I know about the woes of file transfer over wireless. If I could drill the holes, I would run Cat 5, but I can't. Fortunately, the biggest backup (the first one) will be done over a wired 100 Mbps connection, so that will be fine.
 
One warning with using rsync: it can be set to just update changes to a file, and that has led to some corrupted files on my machines. D'oh! I always tell it to replace the old file entirely with the updated one, and that has saved me a lot of trouble. (It does not replace ALL files unless all files are changed, only the changed files.) I have a laptop and desktop that I keep synchronized with the help of a little tool called Unison, a graphical sync tool built on an rsync-style algorithm; by default, it checks for changes between two directories (which can be on other computers, USB sticks, external HDDs, etc.) and replaces entire files. It will also checksum files, so it ensures that the transfer went smoothly. Both machines run Linux, so I share /home/myname/Documents on the laptop, mount it over NFS on the desktop, and then run Unison on the desktop.

The machines talk over a 100 Mbit LAN, as the laptop only has a 100 Mbps port while the desktop has a GbE NIC. Synchronizing hundreds of megabytes takes maybe a minute or so. It is so dang handy; I don't know how anybody could do without it. The computers talk through the LAN ports on a 54 Mbit wireless router, and the laptop has an 11 Mbps wireless card in it, whereas my girlfriend's and brothers' computers all run Windows: two have 54 Mbps cards in them, and one has an 11 Mbps card like mine (it's an identical machine). Here are the speeds I see:

1 Gbps LAN-to-LAN: Not tested yet*
100 Mbit LAN-to-LAN: 12.3-12.5 MB/sec
54 Mbit WLAN-to-LAN: 1.9-2.2 MB/sec
54 Mbit WLAN-to-54 Mbit WLAN: ~1 MB/sec
11 Mbit WLAN-to-LAN: 380-430 KB/sec
11 Mbit WLAN-to-54 Mbit WLAN: 380-430 KB/sec
11 Mbit WLAN-to-11 Mbit WLAN: ~250KB/sec

*I estimate that my machine's maximum sustained throughput would be approximately 500 Mbps, as the fastest HDD can pull 72 MB/sec sustained. The only computer that can actually talk to my desktop at gigabit speeds would be one brother's Dell notebook. It has an 80 GB HDD that pulls about 30-35 MB/sec maximum and would thus likely limit the speed to the 200-250 Mbps range. However, the HDD is so fragmented (NTFS) that his machine does not usually saturate a 100 Mbps connection; something like 8 MB/sec is what it usually does, and little more.
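The whole-file strategy described here (checksum both sides, copy only files whose hashes differ) is simple enough to sketch; the directory layout and names below are made up for the example:

```python
# Sketch of whole-file checksum sync: hash each file on both sides and
# copy a file over only when the checksums differ (or it is missing).
# In-memory dicts stand in for the two directory trees.
import hashlib

def digest(data):
    return hashlib.sha1(data).hexdigest()

def files_to_copy(src, dst):
    """src/dst map filename -> bytes; return names needing a full copy."""
    return sorted(
        name for name, data in src.items()
        if name not in dst or digest(dst[name]) != digest(data)
    )

laptop = {"a.txt": b"new", "b.txt": b"same"}
desktop = {"a.txt": b"old", "b.txt": b"same"}
print(files_to_copy(laptop, desktop))  # only the changed file is replaced
```

Replacing whole files trades bandwidth for robustness: a corrupted partial update can never survive, because the destination copy is always a complete, checksummed file.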
 

linux_0

Splendid




Windows is almost always slower than Linux at transferring files.

I routinely get full wire speed, or close to it, on Linux, while Windows often does half of that or less on the same hardware.

Wireless is quite slow as well regardless of OS.