How to extract logs between the current time and the last 15 minutes

I want to extract the log entries between the current timestamp and 15 minutes before it, and send an email to the people configured. I developed the script below, but it's not working properly; can someone help me? I have a log file containing entries of this pattern:



[2016-05-24T00:58:04.508-04:00] [oim_server1] [TRACE:32] [oracle.iam.scheduler.impl.quartz] [tid: OIMQuartzScheduler_QuartzSchedulerThread] [userId: oiminternal] [ecid: 0000LI6NBsP4yk4LzUS4yW1NBABd000003,1:21904] [APP: oim#11.1.2.0.0] [SRC_CLASS: oracle.iam.scheduler.impl.quartz.QuartzJob] [SRC_METHOD: <init>] Constructor QuartzJob
[2016-05-24T00:58:04.508-04:00] [oim_server1] [TRACE:32] [oracle.iam.scheduler.impl.quartz] [tid: OIMQuartzScheduler_QuartzSchedulerThread] [userId: oiminternal] [ecid: 0000LI6NBsP4yk4LzUS4yW1NBABd000003,1:21904] [APP: oim#11.1.2.0.0] [SRC_CLASS: oracle.iam.scheduler.impl.quartz.QuartzJob] [SRC_METHOD: <init>] Constructor QuartzJob
[2016-05-24T00:58:04.513-04:00] [oim_server1] [TRACE:32] [oracle.iam.scheduler.impl.quartz] [tid: OIMQuartzScheduler_Worker-1] [userId: oiminternal] [ecid: 0000LI6NBsP4yk4LzUS4yW1NBABd000003,1:21908] [APP: oim#11.1.2.0.0] [SRC_CLASS: oracle.iam.scheduler.impl.quartz.QuartzTriggerListener] [SRC_METHOD: triggerFired] Trigger state 0
[2016-05-24T00:58:04.515-04:00] [oim_server1] [TRACE:32] [oracle.iam.scheduler.impl.quartz] [tid: OIMQuartzScheduler_Worker-1] [userId: oiminternal] [ecid: 0000LI6NBsP4yk4LzUS4yW1NBABd000003,1:21908] [APP: oim#11.1.2.0.0] [SRC_CLASS: oracle.iam.scheduler.impl.quartz.QuartzTriggerListener] [SRC_METHOD: triggerFired] Trigger state 0
[2016-05-24T00:58:04.516-04:00] [oim_server1] [TRACE:32] [oracle.iam.scheduler.impl.quartz] [tid: OIMQuartzScheduler_Worker-1] [userId: oiminternal] [ecid: 0000LI6NBsP4yk4LzUS4yW1NBABd000003,1:21908] [APP: oim#11.1.2.0.0] [SRC_CLASS: oracle.iam.scheduler.impl.quartz.QuartzTriggerListener] [SRC_METHOD: triggerFired] Trigger Listener QuartzTriggerListener.triggerFired(Trigger trigger, JobExecutionContext ctx)
[2016-05-24T01:00:04.513-04:00] [oim_server1] [WARNING] [oracle.iam.scheduler.vo] [tid: OIMQuartzScheduler_Worker-7] [userId: oiminternal] [ecid: 0000LI6NBsP4yk4LzUS4yW1NBABd000003,1:21956] [APP: oim#11.1.2.0.0] IAM-1020021 Unable to execute job : CmyAccess Flat File WD Candidate with Job History Id:1336814[[
org.identityconnectors.framework.common.exceptions.ConfigurationException: Directory does not contain normal files to read HR-76
at org.identityconnectors.flatfile.utils.FlatFileUtil.assertValidFilesinDir(FlatFileUtil.java:230)
at org.identityconnectors.flatfile.utils.FlatFileUtil.getDir(FlatFileUtil.java:176)
at org.identityconnectors.flatfile.utils.FlatFileUtil.getFlatFileDir(FlatFileUtil.java:182)
at org.identityconnectors.flatfile.FlatFileConnector.executeQuery(FlatFileConnector.java:134)
at org.identityconnectors.flatfile.FlatFileConnector.executeQuery(FlatFileConnector.java:58)
at org.identityconnectors.framework.impl.api.local.operations.SearchImpl.rawSearch(SearchImpl.java:105)
at org.identityconnectors.framework.impl.api.local.operations.SearchImpl.search(SearchImpl.java:82)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.identityconnectors.framework.impl.api.local.operations.ConnectorAPIOperationRunnerProxy.invoke(ConnectorAPIOperationRunnerProxy.java:93)
at com.sun.proxy.$Proxy735.search(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.identityconnectors.framework.impl.api.local.operations.ThreadClassLoaderManagerProxy.invoke(ThreadClassLoaderManagerProxy.java:107)
at com.sun.proxy.$Proxy735.search(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.identityconnectors.framework.impl.api.BufferedResultsProxy$BufferedResultsHandler.run(BufferedResultsProxy.java:162)



The script I have written counts the errors found and stores the number in a file; if the error count has increased, it should send a mail. I can configure cron to run it, but the script is not working correctly. Can someone help me extract the logs between the current time and the last 15 minutes and generate a temp file?




LogDir=/data/app/Oracle/Middleware/user_projects/domains/oim_domain/servers/oim_server1/logs
EMAIL1=xxx@gmail.com
SUBJECT=Failed
MESSAGE="Scheduler failed"
SMTP="SMTPHOSTNAME"
SENDER=no-reply@gmail.com



NOW=$(date +"%FT%T.000%-04:00")
T2=$(date --date='15 minutes ago' +"%FT%T.000%-04:00")
OUT=/tmp/oim_server1-diagnostic_$(date +%F-%H-%M).log


find $LogDir -mmin -15 -name "oim_server1-diagnostic.log" > files.txt

count=0;
if [ -f lastCount ]; then
count=$(cat lastCount)
fi


while read file
do
    echo "reading file n " $file
    currentCount=$(grep -c 'Directory does not contain normal files to read HR-76' $file)
    if [ $currentCount -ne $count -a $currentCount -ne 0 ]; then
        echo "Error Found " $currentCount
        awk -v TSTART="[$T2]" -v TEND="[$NOW]" '$1>=TSTART && $1<=TEND' $LogDir/oim_server1-diagnostic.log > "$OUT"
        test -s $OUT &&
        echo -e "$MESSAGE" | mailx -S smtp="$SMTP" -a "$OUT" -r "$SENDER" -s "$SUBJECT" "$EMAIL1"
        rm -f "$OUT"
    fi
    echo $currentCount > lastCount
done < files.txt


This script is extracting the logs, but not in the appropriate format. The error I am counting is the one I find with

(grep -c 'Directory does not contain normal files to read HR-76' $file)

I want to extract all log lines between two timestamps. Some lines may not have a timestamp, but I want those lines as well. In short, I want every line that falls between the two timestamps. My script gives me only the lines that carry a timestamp; the rest of the lines are missing. Any suggestions? Please note that the start or end timestamp may not be present on every line of the log, but I want every line between those two timestamps.
A sample of what my script currently generates from the above log:



[2016-05-24T01:00:04.513-04:00] [oim_server1] [WARNING] [oracle.iam.scheduler.vo] [tid: OIMQuartzScheduler_Worker-6] [userId: oiminternal] [ecid: 0000LIt5i3n4yk4LzU^AyW1NEPxf000002,1:23444] [APP: oim#11.1.2.0.0] IAM-1020021 Unable to execute job : CmyAccess Flat File WD Employee with Job History Id:46608[[ 

  • Hi Puneet and welcome to Unix & Linux Stack Exchange. When posting, instead of using <p> to format your code, select the sections of the question where you've written the code and press Ctrl+K. It will end up much more readable and your question is much more likely to get answered.
    – Peter David Carter
    May 24 '16 at 10:49

  • grep is not a good way of matching timestamps, as it doesn't understand the 'value' of a time field. I would suggest instead that you need to parse the date, and filter that.
    – Sobrique
    May 24 '16 at 11:13

  • Now I understand. I think the solution is very simple. See my answer.
    – Otheus
    May 25 '16 at 13:15

Tags: text-processing awk sed grep
asked May 24 '16 at 9:56 by Puneet Khullar; edited May 25 '16 at 13:24 by Otheus

3 Answers
Regular expressions aren't a good choice for matching timestamps, as they don't really 'understand' numeric values. So instead, I'd suggest parsing the timestamp:



#!/usr/bin/env perl
use strict;
use warnings;

use Time::Piece;

my $now = time();
my $last = $now - 15 * 60;

while ( <> ) {
    my ( $timecode ) = m/\[([^.]+)/;
    print $timecode;
    my $t = Time::Piece->strptime( $timecode, '%Y-%m-%dT%H:%M:%S' );
    print if $t > $last;
}



This can be reduced to a one-liner:



perl -MTime::Piece -ne 'print if Time::Piece->strptime( (m/\[([^.]+)/), "%Y-%m-%dT%H:%M:%S" ) > time() - 15 * 60'
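
Usage sketch (the $LogDir and $OUT variables here are the ones from the question's script):

perl -MTime::Piece -ne 'print if Time::Piece->strptime( (m/\[([^.]+)/), "%Y-%m-%dT%H:%M:%S" ) > time() - 15 * 60' \
    "$LogDir/oim_server1-diagnostic.log" > "$OUT"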





– answered May 24 '16 at 11:20 by Sobrique

    I'd use:



    awk -v limit="$(date -d '15 minutes ago' +'[%FT%T')" '
    $0 >= limit' < log-file


    That ignores the potential problems you may have for two hours per year, when the GMT offset goes from -04:00 to -05:00 and back, if daylight saving applies in your timezone.



    date -d is GNU specific, but you're using it already.
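
    A usage sketch (the $LogDir and $OUT variables are the ones from the question's script):

    awk -v limit="$(date -d '15 minutes ago' +'[%FT%T')" \
        '$0 >= limit' < "$LogDir/oim_server1-diagnostic.log" > "$OUT"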






    – answered May 24 '16 at 11:44 by Stéphane Chazelas

      You almost have it.



      Step 1



      On GNU and Linux and perhaps other systems, the date command can be used to print out an arbitrary format for a time specification given a user-friendly time expression. For instance, I can get a string representing the time from 15 minutes in the past using this:



       date --date='15 minutes ago'


      You've pretty much done this with your code, albeit less efficiently. But you left out the microseconds. Presumably you don't actually care about them, but you do have to match them. The timezone could be done with '%:z', but presumably you need to match the existing timezone; things might break shortly after daylight-saving time switches. If you have to worry about multiple timezones, you'll need a regex or Sobrique's solution. Caveat emptor, you could probably get away with:



      NOW=$(date +"%FT%T.000%-04:00")
      T2=$(date --date='15 minutes ago' +"%FT%T.000%-04:00")
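
      A sketch of the '%:z' alternative (assumes GNU date): let date supply the numeric offset instead of hard-coding it, since %:z prints the same -04:00 / -05:00 form the log uses.

      NOW=$(date +"%FT%T.000%:z")
      T2=$(date --date='15 minutes ago' +"%FT%T.000%:z")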


      Step 2



      You can use your NOW and T2 as inputs to awk. String-based matching will work just fine here, but awk lets you make sure that every line fits within the required time range by doing greater-than and less-than string compares.



      awk -v TSTART="[$T2]" -v TEND="[$NOW]" '$1>=TSTART && $1<=TEND'
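
      As a quick check of why plain string compares are enough here: equal-width ISO-8601 timestamps sort the same way lexically as chronologically, so a later entry compares greater as a string.

      # prints 1: the later timestamp from the sample log is "greater" as a plain string
      awk 'BEGIN { print ("[2016-05-24T01:00:04.513-04:00]" >= "[2016-05-24T00:58:04.508-04:00]") }'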


      Step 3 (NEW)



      You have to get the untimestamped log lines that sit in between the timestamped ones. So we use the above awk code to match timestamped lines and, when the timestamp is in the right range, set a flag. Only while the flag is set is the current line printed.



      awk -v TSTART="[$T2]" -v TEND="[$NOW]" '
          /^\[[^ ]*\] / { flag = ($1 >= TSTART && $1 <= TEND) }
          flag          { print }'
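
      For example (a sketch; sample.log is a hypothetical file holding the entries quoted in the question, and the window is chosen by hand to cover the 01:00:04 warning), the stack-trace lines following that warning are printed too, because the flag set by the timestamped line stays on until the next timestamped line:

      T2='2016-05-24T00:59:00.000-04:00'
      NOW='2016-05-24T01:01:00.000-04:00'
      awk -v TSTART="[$T2]" -v TEND="[$NOW]" '
          /^\[[^ ]*\] / { flag = ($1 >= TSTART && $1 <= TEND) }
          flag          { print }' sample.log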


      Step 4



      As usual, redirect output, but you should note that if you're running your script every 15 minutes, your current redirection code will overwrite the same file because %H won't have changed for at least 3 of those runs. Better make it %H-%M or something. But there's no need for any redirect-to-file at all. You can send it directly to mail (unless you really need the attachment):



      {
        echo "$MESSAGE"; echo
        awk -v TSTART="[$T2]" -v TEND="[$NOW]" '$1>=TSTART && $1<=TEND' $LogDir/oim_server1-diagnostic.log
      } | mailx -S smtp="$SMTP" -r "$SENDER" -s "$SUBJECT" "$EMAIL1"


      The rest of your script should work as-is.



      However, in the above setting, now you don't get an attachment that you can easily save with the relevant date-timestamp. And maybe you don't want to send mail if there is no output. So you could do something like this:



      OUT=/tmp/oim_server1-diagnostic_$(date +%F-%H-%M)
      awk -v TSTART="[$T2]" -v TEND="[$NOW]" '$1>=TSTART && $1<=TEND' $LogDir/oim_server1-diagnostic.log > "$OUT"
      test -s $OUT &&
      echo -e "$MESSAGE" | mailx -S smtp="$SMTP" -a "$OUT" -r "$SENDER" -s "$SUBJECT" "$EMAIL1"
      rm -f "$OUT"


      This cleans up after itself so you don't have numerous diagnostic log files left hanging around.
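
      Putting the steps together, a sketch of how the whole cron job could look (variable values and mailx flags are taken from the question's script; %:z is the GNU date alternative mentioned in Step 1):

      #!/bin/bash
      LogDir=/data/app/Oracle/Middleware/user_projects/domains/oim_domain/servers/oim_server1/logs
      EMAIL1=xxx@gmail.com
      SUBJECT=Failed
      MESSAGE="Scheduler failed"
      SMTP="SMTPHOSTNAME"
      SENDER=no-reply@gmail.com

      # Window boundaries in the same format the log uses (GNU date assumed)
      NOW=$(date +"%FT%T.000%:z")
      T2=$(date --date='15 minutes ago' +"%FT%T.000%:z")
      OUT=/tmp/oim_server1-diagnostic_$(date +%F-%H-%M).log

      # Keep every line between the two timestamps, including untimestamped continuation lines
      awk -v TSTART="[$T2]" -v TEND="[$NOW]" '
          /^\[[^ ]*\] / { flag = ($1 >= TSTART && $1 <= TEND) }
          flag          { print }' "$LogDir/oim_server1-diagnostic.log" > "$OUT"

      # Mail the extract only if it is non-empty, then clean up
      test -s "$OUT" &&
          echo -e "$MESSAGE" | mailx -S smtp="$SMTP" -a "$OUT" -r "$SENDER" -s "$SUBJECT" "$EMAIL1"
      rm -f "$OUT"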






      – answered by Otheus

      • I would add another step and compress the logfile before mailing it.
        – ott--
        May 24 '16 at 18:14










      • @ott-- Why? Then the user gets a logfile they cannot read, or one that takes at least several steps to decompress.
        – Otheus
        May 24 '16 at 19:48










      • You should not encrypt it so the user can't read it. It's just not polite to send an attachment of 15 MB while the gzipped file is 1.5 MB in size only. And less can show it without an extra step. Remember this is Unix & Linux, not Windows or MacOSX.
        – ott--
        May 24 '16 at 20:17










      • How do you know the system receiving the emails is UNIX? Oh well, the OP can figure out if and how he wants to compress it.
        – Otheus
        May 24 '16 at 21:53










      • @Otheus the log file generation is incomplete. I am going to edit the question; please refer to that.
        – Puneet Khullar
        May 25 '16 at 6:32










      Your Answer







      StackExchange.ready(function()
      var channelOptions =
      tags: "".split(" "),
      id: "106"
      ;
      initTagRenderer("".split(" "), "".split(" "), channelOptions);

      StackExchange.using("externalEditor", function()
      // Have to fire editor after snippets, if snippets enabled
      if (StackExchange.settings.snippets.snippetsEnabled)
      StackExchange.using("snippets", function()
      createEditor();
      );

      else
      createEditor();

      );

      function createEditor()
      StackExchange.prepareEditor(
      heartbeatType: 'answer',
      convertImagesToLinks: false,
      noModals: false,
      showLowRepImageUploadWarning: true,
      reputationToPostImages: null,
      bindNavPrevention: true,
      postfix: "",
      onDemand: true,
      discardSelector: ".discard-answer"
      ,immediatelyShowMarkdownHelp:true
      );



      );













       

      draft saved


      draft discarded


















      StackExchange.ready(
      function ()
      StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2funix.stackexchange.com%2fquestions%2f285111%2fhow-to-extract-logs-between-the-current-time-and-the-last-15-minutes%23new-answer', 'question_page');

      );

      Post as a guest






























      3 Answers
      3






      active

      oldest

      votes








      3 Answers
      3






      active

      oldest

      votes









      active

      oldest

      votes






      active

      oldest

      votes








      up vote
      0
      down vote













      Regular expressions aren't a good choice for matching timestamps, as they don't really 'understand' numeric values. So instead, I'd suggest parsing the timestamp:



      #!/usr/bin/env perl
      use strict;
      use warnings;

      use Time::Piece;

      my $now = time();
      my $last = $now - 15 * 60;

      while ( <> )
      my ( $timecode ) = m/[([^.]+)/;
      print $timecode;
      my $t = Time::Piece -> strptime ( $timecode, '%Y-%m-%dT%H:%M:%S' );
      print if $t > $last;



      This can one-liner as:



      perl -MTime::Piece -ne 'print if Time::Piece -> strptime ( m/[([^.]+)/, '%Y-%m-%dT%H:%M:%S' ) > time() - 15 * 60'





      share|improve this answer
























        up vote
        0
        down vote













        Regular expressions aren't a good choice for matching timestamps, as they don't really 'understand' numeric values. So instead, I'd suggest parsing the timestamp:



        #!/usr/bin/env perl
        use strict;
        use warnings;

        use Time::Piece;

        my $now = time();
        my $last = $now - 15 * 60;

        while ( <> )
        my ( $timecode ) = m/[([^.]+)/;
        print $timecode;
        my $t = Time::Piece -> strptime ( $timecode, '%Y-%m-%dT%H:%M:%S' );
        print if $t > $last;



        This can one-liner as:



        perl -MTime::Piece -ne 'print if Time::Piece -> strptime ( m/[([^.]+)/, '%Y-%m-%dT%H:%M:%S' ) > time() - 15 * 60'





        share|improve this answer






















          up vote
          0
          down vote










          up vote
          0
          down vote









          Regular expressions aren't a good choice for matching timestamps, as they don't really 'understand' numeric values. So instead, I'd suggest parsing the timestamp:



          #!/usr/bin/env perl
          use strict;
          use warnings;

          use Time::Piece;

          my $now = time();
          my $last = $now - 15 * 60;

          while ( <> )
          my ( $timecode ) = m/[([^.]+)/;
          print $timecode;
          my $t = Time::Piece -> strptime ( $timecode, '%Y-%m-%dT%H:%M:%S' );
          print if $t > $last;



          This can one-liner as:



          perl -MTime::Piece -ne 'print if Time::Piece -> strptime ( m/[([^.]+)/, '%Y-%m-%dT%H:%M:%S' ) > time() - 15 * 60'





          share|improve this answer












          Regular expressions aren't a good choice for matching timestamps, as they don't really 'understand' numeric values. So instead, I'd suggest parsing the timestamp:



          #!/usr/bin/env perl
          use strict;
          use warnings;

          use Time::Piece;

          my $now = time();
          my $last = $now - 15 * 60;

          while ( <> )
          my ( $timecode ) = m/[([^.]+)/;
          print $timecode;
          my $t = Time::Piece -> strptime ( $timecode, '%Y-%m-%dT%H:%M:%S' );
          print if $t > $last;



          This can one-liner as:



          perl -MTime::Piece -ne 'print if Time::Piece -> strptime ( m/[([^.]+)/, '%Y-%m-%dT%H:%M:%S' ) > time() - 15 * 60'






          share|improve this answer












          share|improve this answer



          share|improve this answer










          answered May 24 '16 at 11:20









          Sobrique

          3,749517




          3,749517






















              up vote
              0
              down vote













              I'd use:



              awk -v limit="$(date -d '15 minutes ago' +'[%FT%T')" '
              $0 >= limit' < log-file


              That ignores the potential problems you may have two hours per year when the GMT offsets goes from -04:00 to -05:00 and back if daylight saving applies in your timezone.



              date -d is GNU specific, but you're using it already.






              share|improve this answer
























                up vote
                0
                down vote













                I'd use:



                awk -v limit="$(date -d '15 minutes ago' +'[%FT%T')" '
                $0 >= limit' < log-file


                That ignores the potential problems you may have two hours per year when the GMT offsets goes from -04:00 to -05:00 and back if daylight saving applies in your timezone.



                date -d is GNU specific, but you're using it already.






                share|improve this answer






















                  up vote
                  0
                  down vote










                  up vote
                  0
                  down vote









                  I'd use:



                  awk -v limit="$(date -d '15 minutes ago' +'[%FT%T')" '
                  $0 >= limit' < log-file


                  That ignores the potential problems you may have two hours per year when the GMT offsets goes from -04:00 to -05:00 and back if daylight saving applies in your timezone.



                  date -d is GNU specific, but you're using it already.






                  share|improve this answer












                  I'd use:



                  awk -v limit="$(date -d '15 minutes ago' +'[%FT%T')" '
                  $0 >= limit' < log-file


                  That ignores the potential problems you may have two hours per year when the GMT offsets goes from -04:00 to -05:00 and back if daylight saving applies in your timezone.



                  date -d is GNU specific, but you're using it already.







                  share|improve this answer












                  share|improve this answer



                  share|improve this answer










                  answered May 24 '16 at 11:44









                  Stéphane Chazelas

                  284k53524862




                  284k53524862




















                      up vote
                      0
                      down vote













                      You almost have it.



                      Step1



                      On GNU and Linux and perhaps other systems, the date command can be used to print out an arbitrary format for a time specification given a user-friendly time expression. For instance, I can get a string representing the time from 15 minutes in the past using this:



                       date --date='15 minutes ago'


                      You've pretty much done this with your code, albeit less efficiently. But you left out the microseconds. Presumably you don't actually care about them, but you do have to match them. The timezone could be done with '%:z' but presumably, you need to match the existing timezone; things might break shortly after daylight savings time switches. If you have to worry about multiple timezones, you'll need a regex or the Sobrique's solution. Caveat emptor, you could probably get away with:



                      NOW=$(date +"%FT%T.000%-04:00")
                      T2=$(date --date='15 minutes ago' +"%FT%T.000%-04:00")


                      Step 2



                      You can use your NOW and T2 as inputs to awk. String-based matching will work just fine here, but awk lets you make sure that every line fits within the required time range by doing greater-than and less-than string compares.



                      awk -v TSTART="[$T2]" -v TEND="[$NOW]" '$1>=TSTART && $1<=TEND'


                      Step 3 (NEW)



                      You have to get the untimestamped log lines in between such timestamped lines. So we use the above awk code to match timestamp lines and when the timestamp is in the right range, set a flag. Only when the flag is set to 1, the current line is printed.



                      awk -v TSTART="[$T2]" -v TEND="[$NOW]" 
                      '/^[[^ ]*] / log = ($1>=TSTART && $1<=TEND) log print '


                      Step 4



                      As usual, redirect output, but you should note that if you're running your script every 15 minutes, your current redirection code will overwrite the same file because %H won't have changed for at least 3 of those runs. Better make it %H-%M or something. But there's no need for any redirect-to-file at all. You can send it directly to mail (unless you really need the attachment):



                       
                      echo "$MESSAGE"; echo
                      awk -v TSTART="[$T2]" -v TEND="[$NOW]" '$1>=TSTART && $1<=TEND' $LogDir/oim_server1-diagnostic.log
                      | mailx -S smtp="$SMTP" -r "$SENDER" -s "$SUBJECT" "$EMAIL1"


                      The rest of your script should work as-is.



                      However, in the above setting, now you don't get an attachment that you can easily save with the relevant date-timestamp. And maybe you don't want to send mail if there is no output. So you could do something like this:



                      OUT=/tmp/oim_server1-diagnostic_$(date +%F-%H-%M)
                      awk -v TSTART="[$T2]" -v TEND="[$NOW]" '$1>=TSTART && $1<=TEND' $LogDir/oim_server1-diagnostic.log > "$OUT"
                      test -s $OUT &&
                      echo -e "$MESSAGE" | mailx -S smtp="$SMTP" -a "$OUT" -r "$SENDER" -s "$SUBJECT" "$EMAIL1"
                      rm -f "$OUT"


                      This cleans up after itself so you don't have numerous diagnostic log files left hanging around.






                      share|improve this answer






















                      • I would another step and compress the logfile before maling it.
                        – ott--
                        May 24 '16 at 18:14










                      • @ott-- Why? Then the user gets a logfile they cannot read, or one that takes at least several steps to decompress.
                        – Otheus
                        May 24 '16 at 19:48










                      • You should not encrypt it so the user can't read it. It's just not polite to send an attachment of 15 MB while the gzipped file is 1.5 MB in size only. And less can show it without an extra step. Remember this is Unix & Linux, not Windows or MacOSX.
                        – ott--
                        May 24 '16 at 20:17










                      • How do you know the system receiving the emails is UNIX? Oh well, the OP can figure out if and how he wants to compress it.
                        – Otheus
                        May 24 '16 at 21:53










                      • @Otheus the log file genertion is incomplete. I am going to edit the question please refer that.
                        – Puneet Khullar
                        May 25 '16 at 6:32














                      up vote
                      0
                      down vote













                      You almost have it.



                      Step1



                      On GNU and Linux and perhaps other systems, the date command can be used to print out an arbitrary format for a time specification given a user-friendly time expression. For instance, I can get a string representing the time from 15 minutes in the past using this:



                       date --date='15 minutes ago'


                      You've pretty much done this with your code, albeit less efficiently. But you left out the microseconds. Presumably you don't actually care about them, but you do have to match them. The timezone could be done with '%:z' but presumably, you need to match the existing timezone; things might break shortly after daylight savings time switches. If you have to worry about multiple timezones, you'll need a regex or the Sobrique's solution. Caveat emptor, you could probably get away with:



                      NOW=$(date +"%FT%T.000%-04:00")
                      T2=$(date --date='15 minutes ago' +"%FT%T.000%-04:00")


                      Step 2



                      You can use your NOW and T2 as inputs to awk. String-based matching will work just fine here, but awk lets you make sure that every line fits within the required time range by doing greater-than and less-than string compares.



                      awk -v TSTART="[$T2]" -v TEND="[$NOW]" '$1>=TSTART && $1<=TEND'


                      Step 3 (NEW)



                      You have to get the untimestamped log lines in between such timestamped lines. So we use the above awk code to match timestamp lines and when the timestamp is in the right range, set a flag. Only when the flag is set to 1, the current line is printed.



                      awk -v TSTART="[$T2]" -v TEND="[$NOW]" 
                      '/^[[^ ]*] / log = ($1>=TSTART && $1<=TEND) log print '


                      Step 4



                      As usual, redirect output, but you should note that if you're running your script every 15 minutes, your current redirection code will overwrite the same file because %H won't have changed for at least 3 of those runs. Better make it %H-%M or something. But there's no need for any redirect-to-file at all. You can send it directly to mail (unless you really need the attachment):



                       
                      echo "$MESSAGE"; echo
                      awk -v TSTART="[$T2]" -v TEND="[$NOW]" '$1>=TSTART && $1<=TEND' $LogDir/oim_server1-diagnostic.log
                      | mailx -S smtp="$SMTP" -r "$SENDER" -s "$SUBJECT" "$EMAIL1"


                      The rest of your script should work as-is.



                      However, in the above setting, now you don't get an attachment that you can easily save with the relevant date-timestamp. And maybe you don't want to send mail if there is no output. So you could do something like this:



                      OUT=/tmp/oim_server1-diagnostic_$(date +%F-%H-%M)
                      awk -v TSTART="[$T2]" -v TEND="[$NOW]" '$1>=TSTART && $1<=TEND' $LogDir/oim_server1-diagnostic.log > "$OUT"
                      test -s $OUT &&
                      echo -e "$MESSAGE" | mailx -S smtp="$SMTP" -a "$OUT" -r "$SENDER" -s "$SUBJECT" "$EMAIL1"
                      rm -f "$OUT"


                      This cleans up after itself so you don't have numerous diagnostic log files left hanging around.






                      share|improve this answer






















                      • I would another step and compress the logfile before maling it.
                        – ott--
                        May 24 '16 at 18:14










                      • @ott-- Why? Then the user gets a logfile they cannot read, or one that takes at least several steps to decompress.
                        – Otheus
                        May 24 '16 at 19:48










                      • You should not encrypt it so the user can't read it. It's just not polite to send an attachment of 15 MB while the gzipped file is 1.5 MB in size only. And less can show it without an extra step. Remember this is Unix & Linux, not Windows or MacOSX.
                        – ott--
                        May 24 '16 at 20:17










                      • How do you know the system receiving the emails is UNIX? Oh well, the OP can figure out if and how he wants to compress it.
                        – Otheus
                        May 24 '16 at 21:53










                      • @Otheus the log file genertion is incomplete. I am going to edit the question please refer that.
                        – Puneet Khullar
                        May 25 '16 at 6:32












                      up vote
                      0
                      down vote










                      up vote
                      0
                      down vote









                      You almost have it.



                      Step1



                      On GNU and Linux and perhaps other systems, the date command can be used to print out an arbitrary format for a time specification given a user-friendly time expression. For instance, I can get a string representing the time from 15 minutes in the past using this:



                       date --date='15 minutes ago'


                      You've pretty much done this with your code, albeit less efficiently. But you left out the microseconds. Presumably you don't actually care about them, but you do have to match them. The timezone could be done with '%:z' but presumably, you need to match the existing timezone; things might break shortly after daylight savings time switches. If you have to worry about multiple timezones, you'll need a regex or the Sobrique's solution. Caveat emptor, you could probably get away with:



                      NOW=$(date +"%FT%T.000%-04:00")
                      T2=$(date --date='15 minutes ago' +"%FT%T.000%-04:00")


                      Step 2



                      You can use your NOW and T2 as inputs to awk. String-based matching will work just fine here, but awk lets you make sure that every line fits within the required time range by doing greater-than and less-than string compares.



                      awk -v TSTART="[$T2]" -v TEND="[$NOW]" '$1>=TSTART && $1<=TEND'


                      Step 3 (NEW)



                      You have to get the untimestamped log lines in between such timestamped lines. So we use the above awk code to match timestamp lines and when the timestamp is in the right range, set a flag. Only when the flag is set to 1, the current line is printed.



                      awk -v TSTART="[$T2]" -v TEND="[$NOW]" 
                      '/^[[^ ]*] / log = ($1>=TSTART && $1<=TEND) log print '


                      Step 4



                      As usual, redirect output, but you should note that if you're running your script every 15 minutes, your current redirection code will overwrite the same file because %H won't have changed for at least 3 of those runs. Better make it %H-%M or something. But there's no need for any redirect-to-file at all. You can send it directly to mail (unless you really need the attachment):



                       
                      echo "$MESSAGE"; echo
                      awk -v TSTART="[$T2]" -v TEND="[$NOW]" '$1>=TSTART && $1<=TEND' $LogDir/oim_server1-diagnostic.log
                      | mailx -S smtp="$SMTP" -r "$SENDER" -s "$SUBJECT" "$EMAIL1"


                      The rest of your script should work as-is.



                      However, with the above approach you don't get an attachment that you can easily save with the relevant date-timestamp. And maybe you don't want to send mail at all if there is no output. So you could do something like this:



                      OUT=/tmp/oim_server1-diagnostic_$(date +%F-%H-%M)
                      awk -v TSTART="[$T2]" -v TEND="[$NOW]" '$1>=TSTART && $1<=TEND' "$LogDir/oim_server1-diagnostic.log" > "$OUT"
                      test -s "$OUT" &&
                        echo -e "$MESSAGE" | mailx -S smtp="$SMTP" -a "$OUT" -r "$SENDER" -s "$SUBJECT" "$EMAIL1"
                      rm -f "$OUT"


                      This cleans up after itself so you don't have numerous diagnostic log files left hanging around.
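
                      If the extract can be large, ott--'s suggestion in the comments is to compress it before mailing. A minimal sketch of that variant, under the same assumptions as above (gzip available, the same mailx with -a for attachments):

                      OUT=/tmp/oim_server1-diagnostic_$(date +%F-%H-%M)
                      awk -v TSTART="[$T2]" -v TEND="[$NOW]" '$1>=TSTART && $1<=TEND' "$LogDir/oim_server1-diagnostic.log" > "$OUT"
                      if [ -s "$OUT" ]; then
                          gzip "$OUT"        # replaces $OUT with $OUT.gz
                          echo -e "$MESSAGE" | mailx -S smtp="$SMTP" -a "$OUT.gz" -r "$SENDER" -s "$SUBJECT" "$EMAIL1"
                      fi
                      rm -f "$OUT" "$OUT.gz"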






                      share|improve this answer






















                      edited May 25 '16 at 13:22

























                      answered May 24 '16 at 12:16









                      Otheus

                      3,207730















                      • I would add another step and compress the logfile before mailing it.
                        – ott--
                        May 24 '16 at 18:14










                      • @ott-- Why? Then the user gets a logfile they cannot read, or one that takes at least several steps to decompress.
                        – Otheus
                        May 24 '16 at 19:48










                      • You should not encrypt it so the user can't read it. It's just not polite to send a 15 MB attachment when the gzipped file is only 1.5 MB. And less can show it without an extra step. Remember this is Unix & Linux, not Windows or Mac OS X.
                        – ott--
                        May 24 '16 at 20:17










                      • How do you know the system receiving the emails is UNIX? Oh well, the OP can figure out if and how he wants to compress it.
                        – Otheus
                        May 24 '16 at 21:53










                      • @Otheus the log file generation is incomplete. I am going to edit the question; please refer to that.
                        – Puneet Khullar
                        May 25 '16 at 6:32















