resume reading a log file from the point I left it last time [duplicate]

This question already has an answer here:



  • Display lines in last 10 minutes with specific pattern in logs

    3 answers



I have a log file that is continuously updated (new lines are appended) every so often.



I am fetching only the error messages from the file every 10 minutes.



Initially, on the first run, I fetched all lines matching the pattern "ERROR FOUND" into a new file using awk.
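
Something like this (the file names are placeholders):

awk '/ERROR FOUND/' application.log > errors.txt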



But after 10 minutes, more new lines have been added to the log file, so I want to resume reading the log file where I left off. I don't want to start from the beginning again.



Can anybody suggest the best code or script for this?










awk scripting vi gawk






asked Aug 29 at 6:53 by Vipin Sahu (edited Aug 29 at 15:01 by Stéphane Chazelas)
marked as duplicate by msp9011, DarkHeart, countermode, schily, Thomas Aug 29 at 17:31


This question has been asked before and already has an answer. If those answers do not fully address your question, please ask a new question.






  • Yes, it is a back-end system file which updates every second. I'm trying to store the last line number in another file, so I can start reading the file again after that line number, but somehow I'm not able to query it.
    – Vipin Sahu
    Aug 29 at 7:08

















3 Answers






If you open the file on a file descriptor like:



exec 3< /path/to/log/file


You can then process it:



awk '...' <&3


After which fd 3 will point to where awk left it.



10 minutes later, from the same shell invocation, you can run that



awk '...' <&3


command again to process the new data.



If you want to save the position you were at, so you can resume reading from a different shell invocation, with ksh93, you can do:



#! /usr/bin/env ksh93
file=/path/to/some-file
offset_file=$file.offset

exec 3< "$file"
[ -f "$offset_file" ] && exec 3<#(($(<"$offset_file")))

awk '...' <&3

echo "$(3<#((CUR)))" > "$offset_file"


Or with zsh:



#! /usr/bin/env zsh

zmodload zsh/system
file=/path/to/some-file
offset_file=$file.offset

exec 3< $file
[ -f "$offset_file" ] && sysseek -u 3 "$(<$offset_file)"

awk '...' <&3

echo $((systell(3))) > $offset_file
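
As a minimal usage sketch (the log path, pattern, and output file are placeholders, and this assumes the log file is never rotated or truncated), the whole thing can be driven from one long-lived shell session:

exec 3< /path/to/log/file        # open once; the kernel keeps the offset on fd 3

while true; do
    # each pass reads only the data appended since the previous pass
    awk '/ERROR FOUND/' <&3 >> /path/to/errors.out
    sleep 600                    # wait 10 minutes
done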





answered Aug 29 at 9:36 by Stéphane Chazelas (edited Aug 29 at 9:44)
I like Stéphane's answer because it doesn't read the whole file again and again, so here is the bash (on Linux) equivalent of his solution (bash has no built-in seek or tell ability). I would have used a comment, but my reputation is too low.



LASTPOS=/tmp/saved_pos

exec 3< "$1"
test -f "$LASTPOS" && STARTPOS=$(($(<$LASTPOS)+1))
tail -c "+${STARTPOS:-1}" <&3 | grep "ERROR FOUND"
grep '^pos:' /proc/self/fdinfo/3 | cut -f2 > "$LASTPOS"


I also replaced the awk command with a grep because it is usually faster. You can pipe the output to an awk command if you need further processing.
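
For reference, this is roughly what the fdinfo file looks like once some bytes have been consumed (values illustrative; the fields are tab-separated, which is why cut -f2 extracts the offset):

$ exec 3< /var/log/app.log                  # hypothetical log file
$ dd bs=100 count=1 <&3 > /dev/null 2>&1    # consume the first 100 bytes on fd 3
$ cat /proc/self/fdinfo/3
pos:    100
flags:  0100000
mnt_id: 28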






answered Aug 29 at 14:27 by David (edited Aug 29 at 15:12 by Stéphane Chazelas)
I would give it a try with wc -l and tail.

If you are using bash, this should work:



#!/bin/bash
LASTLNFILE=/tmp/lastline    # replace with a suitable path
test -f "$LASTLNFILE" && LASTLN=$(<"$LASTLNFILE")
CURLN=$(wc -l "$1" | cut -d' ' -f1)

if ((CURLN-LASTLN > 0)); then
    tail -n $((CURLN-LASTLN)) "$1"
fi
echo "$CURLN" > "$LASTLNFILE"


P.S. Use it as a filter before your awk program, e.g. (assuming you named it 'newlines.sh'):



./newlines.sh <log_file> | awk -f <your_awk_program>



I am leaving the above script as an example of how not to do it: just after writing it, I realized it is vulnerable to a race condition whenever the log file is updated while the script is running.

A pure AWK approach is preferable:



#!/bin/awk -f

BEGIN {
    lastlinefile = "/tmp/lastlinefile"
    lastline = 0                    # default if the state file does not exist yet
    getline lastline < lastlinefile
}

NR > lastline && /ERROR FOUND/ {
    # do your stuff...
    print
}

END { print NR > lastlinefile }
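
A possible way to run it (the cron line and paths are only an illustration; note that, unlike the fd-based answers, this still re-reads the whole file each run and only suppresses old matches):

# process the log every 10 minutes and append any newly found errors
*/10 * * * * awk -f /home/user/newerrors.awk /var/log/app.log >> /home/user/errors.out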





answered Aug 29 at 8:00 by David (edited Aug 29 at 9:32)