Recently I deleted some useful data by mistake. After that I started looking for file recovery options, searching through a variety of documents and products to see whether any of them could recover my files. Finally, I found TestDisk and PhotoRec, and was able to use the latter to recover my lost files.
Copying files over the network has always been a need. scp and rsync are two very useful tools for this purpose. In fact, for most purposes, I prefer rsync because it works smartly and quickly, with minimal usage of resources. In cases where data has already been copied and only changes need to be updated, rsync is perfect.
Normally rsync runs over the SSH port (22): if you can SSH to a machine, you can rsync to it just as well. But what if you have to rsync content from your machine to a machine inside a remote LAN, a server that is not directly reachable from your machine? I found myself in this situation, and rsync plus an SSH port forward did the trick, as follows:
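A sketch of that trick, assuming a reachable gateway host sitting in front of the internal server (all hostnames, usernames, ports, and paths here are illustrative):

```shell
# Open a local forward: connections to localhost:2222 are tunnelled
# through the reachable gateway to the internal server's SSH port.
ssh -N -L 2222:internal-host:22 user@gateway &

# Point rsync at the forwarded port; the transfer actually lands
# on internal-host, even though we address localhost.
rsync -avz -e 'ssh -p 2222' ./data/ user@localhost:/home/user/data/
```

Once the tunnel is up, any number of rsync runs can reuse it, so repeated syncs stay as cheap as they would be on a direct connection.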
In this article we will explore SVN and how to use it on our favorite OS, Linux.
There is a lot of information available about SVN; I won't repeat it here, nor will I talk about advanced SVN terms and schemas. In fact, I do not know much about them. Instead, I will try to explain in very simple words what SVN is, what its benefits are, and how we can use it. I am not an SVN expert. In fact, I am an SVN beginner. I will basically just write about what I do with SVN and how. I will try not to confuse you the way I was confused for a very long time.
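To give a flavour of the workflow before getting into details, here is a minimal SVN round trip (the repository URL and file names are purely illustrative):

```shell
# Check out a working copy of the project (URL is illustrative):
svn checkout http://svn.example.com/repos/project/trunk project
cd project

# Make a change, tell SVN about the new file, and commit it:
echo "hello" > README
svn add README
svn commit -m "Add a README"

# Pull in changes that others have committed since your checkout:
svn update
```

That checkout/edit/commit/update cycle is essentially all the SVN most day-to-day work needs.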
I tapped the power of shell filters with the following command to get the 10 smallest packages from the Sources file of a Debian repository:
grep -A 3 '^Files:' Sources | grep tar.gz | tr -s ' ' | cut -d ' ' -f 3,4 | sort -n | head
Since I was working in a shell on the repository server itself, which rarely happens for most people, I could use this command directly. But even from your own machine, you can find the smallest source packages using:
apt-cache showsrc $(dpkg -l | tr -s ' ' | cut -d ' ' -f 2) | grep -A 3 '^Files:' | grep tar.gz | tr -s ' ' | cut -d ' ' -f 3,4 | sort -n | head
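For what it's worth, the same filter can be written a little more compactly with awk (a sketch, assuming the same Sources format, where each `Files:` entry line is `checksum size filename`):

```shell
# Pull the size and filename of each .tar.gz entry from a Sources
# file and list the smallest first; awk replaces the tr/cut pair.
grep -A 3 '^Files:' Sources | awk '/tar\.gz/ {print $2, $3}' | sort -n | head
```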
None of us owns the earth. We all share this planet and its resources. Many of our daily activities have an impact on our environment. I passionately believe in preserving my planet for our children. One of the ways to do this is to avoid using more than what I need. Did you know that your monitor can account for as much as 80% of your computer's power consumption? You can easily switch off a desktop PC's monitor, but what about your laptop's?
There's a very useful command that will help you switch off your laptop monitor:
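On X11 systems, one standard way (an assumption on my part, not necessarily the author's exact command) is to force the display off via DPMS:

```shell
# Force the display into DPMS "off" state; it wakes again on the
# next keyboard or mouse event (X11 only).
xset dpms force off
```

If pressing Enter to run the command immediately wakes the screen again, a short delay helps: `sleep 1; xset dpms force off`.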
Due to a misconfiguration in the RecordMyDesktop application (an application that records desktop activity), my root partition got filled to 100%. Nothing was working except the virtual terminal (and even that took more than 10 minutes to launch and log in :(). So I went looking for what could have caused the problem: RecordMyDesktop had been left running in the background by mistake, eating my CPU and memory. I found it through the top utility and killed it :). But the real question was how to find the largest file or directory on disk, the one generated by RecordMyDesktop, so I could delete it and free some disk space. No, I was not familiar with RecordMyDesktop's internals :(.
Now the question is: how do you find the largest files and directories in Linux, sorted by size?
Here we go....
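A common answer, sketched here with du and sort (the path is illustrative; run as root for system-wide results):

```shell
# List the ten biggest files and directories under a path,
# largest first; sizes are in du's default block units.
du -a /var/tmp 2>/dev/null | sort -rn | head -n 10

# With GNU coreutils, -h/-rh give human-readable sizes instead:
du -ah /var/tmp 2>/dev/null | sort -rh | head -n 10
```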