Sort file content in Linux

Sometimes when working with files in Linux, we need to sort a file's contents. For example, suppose we append a new list of domains to Squid's block-domain file with 'cat new-block-domain-file >> /etc/squid/blockeddomains.acl'. When we restart the Squid daemon, it fails with an error saying there is a duplicate domain in the blocked domains file. The way to fix duplicate lines like this is to sort the file and remove the duplicates using the Linux sort and uniq commands.

Here is the command to sort a file and delete duplicate entries using sort and uniq:

sort filename | uniq > newfile

(Note: plain 'uniq' collapses each run of duplicate lines into a single copy, which is what we want here. 'uniq -u' would instead print only lines that appear exactly once, silently dropping every domain that had a duplicate.)

In the example above:

  • filename = the file to be sorted
  • newfile = the name of the new, deduplicated file

We need to save the output to a new file because the sort command only prints to the screen; it does not modify the file in place. To permanently remove the duplicate entries, replace the file that has the duplicate content with the new file we just created, for example with 'mv newfile filename'.
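The whole workflow can be sketched as follows. The file names here are only examples, not Squid's real configuration paths:

```shell
# Create an example blocked-domains file containing a duplicate entry
# (file names are illustrative).
printf 'example.com\nbadsite.org\nexample.com\n' > blockeddomains.acl

# Sort the file and collapse duplicate lines into a new file.
sort blockeddomains.acl | uniq > blockeddomains.new

# Replace the original file with the deduplicated copy.
mv blockeddomains.new blockeddomains.acl
```

After this, the file lists each domain exactly once, in sorted order.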

That's it. Hope it helps some new Linux users out there.

If you want to read more about Squid cache proxy server, here is the link: Install and configure Squid in Slackware



Another option is to use 'sort -u' and then redirect the output to a new file:

sort -u file > new-file
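This does the sort and the deduplication in one step, so no separate uniq is needed. A quick demonstration with a throwaway file (names are illustrative):

```shell
# A small file with an out-of-order duplicate.
printf 'beta\nalpha\nbeta\n' > file

# sort -u sorts the lines and keeps one copy of each.
sort -u file > new-file

cat new-file
# prints:
# alpha
# beta
```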

