How to filter out only fragmented files from the log?
Suppose I have just finished a defragmentation of an ext4 file system on an HDD:
sudo e4defrag -v / > ~/defrag-2017-11-05 2>&1 &
This is most probably unnecessary, but I wanted to see which files had been fragmented.
The log looks like:
==> defrag-2017-11-05 <==
ext4 defragmentation for directory(/)
[1/403415] "/"
File is not regular file [ NG ]
[2/403415] "/usr"
File is not regular file [ NG ]
[3/403415] "/usr/share"
File is not regular file [ NG ]
[4/403415] "/usr/share/ppp"
File is not regular file [ NG ]
[5/403415]^[[79;0H^[[K[5/403415]/usr/share/ppp/chap-secrets: 100% extents: 1 -> 1 [ OK ]
[6/403415]^[[79;0H^[[K[6/403415]/usr/share/ppp/provider.chatscript: 100% extents: 1 -> 1 [ OK ]
[7/403415]^[[79;0H^[[K[7/403415]/usr/share/ppp/provider.peer: 100% extents: 1 -> 1 [ OK ]
[8/403415]^[[79;0H^[[K[8/403415]/usr/share/ppp/pap-secrets: 100% extents: 1 -> 1 [ OK ]
[9/403415] "/usr/share/backgrounds"
File is not regular file [ NG ]
[10/403415] "/usr/share/backgrounds/linuxmint-retro"
File is not regular file [ NG ]
[11/403415]^[[79;0H^[[K[11/403415]/usr/share/backgrounds/linuxmint-retro/Gloria.jpg: 100% extents: 1 -> 1 [ OK ]
[12/403415]^[[79;0H^[[K[12/403415]/usr/share/backgrounds/linuxmint-retro/aviatorjk_2441.jpg: 100% extents: 1 -> 1 [ OK ]
[13/403415]^[[79;0H^[[K[13/403415]/usr/share/backgrounds/linuxmint-retro/theaeffect_3.png: 100% extents: 1 -> 1 [ OK ]
[14/403415]^[[79;0H^[[K[14/403415]/usr/share/backgrounds/linuxmint-retro/multigons.jpg: 100% extents: 1 -> 1 [ OK ]
[15/403415]^[[79;0H^[[K[15/403415]/usr/share/backgrounds/linuxmint-retro/Felicia.png: 100% extents: 1 -> 1 [ OK ]
[16/403415]^[[79;0H^[[K[16/403415]/usr/share/backgrounds/linuxmint-retro/LinuxMint.png: 100% extents: 1 -> 1 [ OK ]
[17/403415]^[[79;0H^[[K[17/403415]/usr/share/backgrounds/linuxmint-retro/air.jpg: 100% extents: 1 -> 1 [ OK ]
[18/403415]^[[79;0H^[[K[18/403415]/usr/share/backgrounds/linuxmint-retro/curve.jpg: 100% extents: 1 -> 1 [ OK ]
[19/403415]^[[79;0H^[[K[19/403415]/usr/share/backgrounds/linuxmint-retro/fizzy.jpg: 100% extents: 1 -> 1 [ OK ]
[20/403415]^[[79;0H^[[K[20/403415]/usr/share/backgrounds/linuxmint-retro/silent_green.jpg: 100% extents: 1 -> 1 [ OK ]
[21/403415]^[[79;0H^[[K[21/403415]/usr/share/backgrounds/linuxmint-retro/aviatorjk_2112.jpg: 100% extents: 1 -> 1 [ OK ]
[22/403415]^[[79;0H^[[K[22/403415]/usr/share/backgrounds/linuxmint-retro/Emotion.jpg: 100% extents: 1 -> 1 [ OK ]
[23/403415]^[[79;0H^[[K[23/403415]/usr/share/backgrounds/linuxmint-retro/pr09studio_spring.png: 100% extents: 1 -> 1 [ OK ]
[24/403415]^[[79;0H^[[K[24/403415]/usr/share/backgrounds/linuxmint-retro/Talento-1.jpg: 100% extents: 1 -> 1 [ OK ]
[324150/403415]^[[79;0H^[[K[324150/403415]/home/ruzena/StaM-EM->enM-CM-)/Altitude.2017.DVDRip.XviD.AC3-EVO/Altitude.2017.DVDRip.XviD.AC3-EVO.avi: 100% extents: 20 -> 20 [ OK ]
[324290/403415]^[[79;0H^[[K[324290/403415]/home/ruzena/StaM-EM->enM-CM-)/Savage.Dog.2017.BRRip.XviD.AC3-EVO/Savage.Dog.2017.BRRip.XviD.AC3-EVO.avi: 100% extents: 20 -> 20 [ OK ]
[325184/403415]^[[79;0H^[[K[325184/403415]/home/ruzena/StaM-EM->enM-CM-)/Death.Race.2050.2017.DVDRip.XviD.AC3-EVO/Death.Race.2050.2017.DVDRip.XviD.AC3-EVO.avi: 100% extents: 20 -> 20 [ OK ]
[325356/403415]^[[79;0H^[[K[325356/403415]/home/ruzena/StaM-EM->enM-CM-)/Kong.Skull.Island.2017.TS.XviD.AC3-RUSSIAN.avi: 100% extents: 20 -> 20 [ OK ]
[352147/403415]^[[79;0H^[[K[352147/403415]/home/ruzena/.cache/google-chrome/Default/Cache/d9b788060b0d42ce_0: 0%^[[79;0H^[[K[352147/403415]/home/ruzena/.cache/google-chrome/Default/Cache/d9b788060b0d42ce_0: 100% extents: 5 -> 1 [ OK ]
[352943/403415]^[[79;0H^[[K[352943/403415]/home/ruzena/.cache/google-chrome/Default/Cache/d7789aeea4cbf251_1: 0%^[[79;0H^[[K[352943/403415]/home/ruzena/.cache/google-chrome/Default/Cache/d7789aeea4cbf251_1: 100% extents: 5 -> 1 [ OK ]
[354676/403415]^[[79;0H^[[K[354676/403415]/home/ruzena/.cache/google-chrome/Default/Cache/98b71219db7f9992_1: 0%^[[79;0H^[[K[354676/403415]/home/ruzena/.cache/google-chrome/Default/Cache/98b71219db7f9992_1: 100% extents: 5 -> 1 [ OK ]
[400977/403415]^[[79;0H^[[K[400977/403415]/home/ruzena/.local/share/zeitgeist/fts.index/postlist.DB: 0%^[[79;0H^[[K[400977/403415]/home/ruzena/.local/share/zeitgeist/fts.index/postlist.DB: 100% extents: 5 -> 1 [ OK ]
Since I don't have any experience with awk and similar tools, I wonder: how can I filter out only the fragmented files from the log, if that is even possible?
For specialists: if you could also sort it by the most fragmented files, that would be awesome, but it is not a condition for answering this question.
A line I don't want to see ends in:
... extents: 1 -> 1 [ OK ]
Lines I want to see end in:
... extents: 5 -> 1 [ OK ]
... extents: 20 -> 5 [ OK ]
That is, I need to show only the lines where the first of those two numbers (5, 20, or whatever happens to be in that position) is higher than 1.
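For reference, a minimal sketch using only grep (assuming the log file name above; it does no sorting at all):
# sketch: keep the "extents:" result lines and drop the unfragmented "1 -> 1" ones
grep 'extents: ' ~/defrag-2017-11-05 | grep -v 'extents: 1 -> 1'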
EDIT:
Example output of the verbose defragmentation for you to try the commands on:
https://www.vlastimilburian.cz/public/linux/defrag-2017-11-05.bz2
Just extract it and you're good to go.
Tags: logs, defragmentation
Can you give an example of a line you want and a line you don't want? – Michael Daffin, Nov 5 '17 at 8:49
So you only want lines with a changing number here: 20 -> 5? – Michael Daffin, Nov 5 '17 at 9:04
asked Nov 5 '17 at 8:18 by Vlastimil; edited Nov 5 '17 at 11:06 by Michael Daffin
1 Answer
Accepted answer (score 2):
awk '{ if ($4 != $6) print $4 - $6 " " $0 }' ~/defrag-2017-11-05 | sort -g
We use awk to compare the two extent-count columns and, when they are not equal, print the difference followed by the whole line. The filtered lines are then sorted by that difference, which we added at the start of each line. For example, a line ending in extents: 5 -> 1 [ OK ] is printed with a leading 4, so sort -g orders the output by how many extents were removed.
If you instead want to keep the lines whose first extent count is higher than 1, you can use
awk '{ if ($4 > 1) print $0 }' ~/defrag-2017-11-05 | sort -gk4
Here we simply sort on the 4th column instead of creating a new difference column.
Edit
To handle spaces in filenames and weird characters at the start of lines and to filter out other lines use
awk '/extents: / { sub(/.*\]\//, "/"); sub(/:/, "", $1); if ($(NF-5) != $(NF-3)) print $(NF-5) - $(NF-3) " " $1 }' ~/defrag-2017-11-05 | sort -g
Here is the awk script formatted nicely to make it easier to read
/extents: / {
    sub(/.*\]\//, "/");
    sub(/:/, "", $1);
    if ($(NF-5) != $(NF-3)) print $(NF-5) - $(NF-3) " " $1
}
- /extents: / filters out any line that does not have extents: in it.
- sub(/.*\]\//, "/") replaces the starting characters up to ]/ with / to strip the nonsense at the start of the lines.
- sub(/:/, "", $1) removes the : from the filename to make it a bit cleaner.
- The if compares the two fields we care about, counting from the end of the line, and prints out the lines where the two numbers differ, together with that difference (a small illustration follows below).
It does not work because there are control characters in the output, which will change the regex I used. There are also lines in the input that we do not want, which should be filtered out. This is why you need to give full examples when you ask the question. – Michael Daffin, Nov 5 '17 at 10:22
So you did not do the file redirection like in your question? Most programs strip color control codes when you redirect the output, which again will change the answer to the question. – Michael Daffin, Nov 5 '17 at 10:26
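(A possible pre-cleaning step for that situation, sketched here assuming GNU sed, would be to strip the ANSI escape sequences such as ^[[79;0H and ^[[K before feeding the log to awk:)
# sketch, assumes GNU sed (\x1b escape); removes ANSI control sequences
sed 's/\x1b\[[0-9;]*[A-Za-z]//g' ~/defrag-2017-11-05 > ~/defrag-clean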
@MichaelDaffin Note that with awk you can also count back from the last field to get the field you want, to avoid complications when there may be spaces in the input line, e.g. $(NF-5) for the number before the ->. – meuh, Nov 5 '17 at 10:49
@meuh that was helpful, thanks – Michael Daffin, Nov 5 '17 at 10:58
answered Nov 5 '17 at 9:15 by Michael Daffin; last edited Nov 5 '17 at 10:57