<br><br><div class="gmail_quote">On Tue, Apr 7, 2009 at 2:19 PM, william pink <span dir="ltr"><<a href="mailto:will.pink@gmail.com">will.pink@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
<div><div></div><div class="h5">On Tue, Apr 7, 2009 at 11:57 AM, - Tethys <span dir="ltr"><<a href="mailto:tethys@gmail.com" target="_blank">tethys@gmail.com</a>></span> wrote:<br></div></div><div class="gmail_quote">
<div><div></div><div class="h5"><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
<div><div></div><div>On Tue, Apr 7, 2009 at 11:11 AM, william pink <<a href="mailto:will.pink@gmail.com" target="_blank">will.pink@gmail.com</a>> wrote:<br>
<br>
> I have the rather horrible task of splitting up lots (40GB's worth) of<br>
> Apache log files by date. The last time I did this, I found the line number,<br>
> then tailed the file and output it into a new file, which was a long,<br>
> arduous task. I imagine this can be done in a few minutes with some<br>
> regex/sed/awk/bash trickery, but I wouldn't know where to start. Can anyone<br>
> give me any pointers to get started?<br>
<br>
</div></div> #!/bin/bash<br>
<br>
indatefmt="+%d/%b/%Y"<br>
outdatefmt="+%Y-%m-%d"<br>
<br>
start_date="mar 25"<br>
end_date=$(date "$indatefmt")<br>
<br>
count=0<br>
while true<br>
do<br>
indate=$(date "$indatefmt" -d "$start_date + $count days")<br>
outdate=$(date "$outdatefmt" -d "$start_date + $count days")<br>
<br>
fgrep "$indate" big_logfile > "small_logfile.$outdate"<br>
<br>
[ "$indate" = "$end_date" ] && break<br>
((count++))<br>
done<br>
<br>
It's a bit inefficient, as it scans the log file multiple times,<br>
but for comparatively small log files like yours, that shouldn't<br>
be too arduous. It'll also pick up any entries that happen to have<br>
the date you're looking for elsewhere in the line (in the URL, for<br>
example). To work around either of those problems, using a scripting<br>
language like Python or Perl to read and examine each line in turn<br>
is probably the right solution. But the quick and dirty approach<br>
above will probably be fine for you.<br>
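For what it's worth, the single-pass idea can also be done in awk rather than Python or Perl. A sketch, assuming the common/combined log format (timestamp is the fourth whitespace-separated field); the sample lines here are stand-ins for the real 40GB file:<br>

```shell
# Create a tiny stand-in for big_logfile so the sketch is self-contained;
# with real data you would point awk at the actual log file instead.
printf '%s\n' \
  '1.2.3.4 - - [25/Mar/2009:10:00:00 +0000] "GET / HTTP/1.1" 200 123' \
  '5.6.7.8 - - [26/Mar/2009:11:00:00 +0000] "GET /x HTTP/1.1" 404 45' > big_logfile

# One pass over the file: pull dd/Mon/yyyy out of field 4 and append each
# line to a per-date output file.
awk '{
    d = substr($4, 2, 11)        # "[25/Mar/2009:10:00:00" -> "25/Mar/2009"
    gsub("/", "-", d)            # filename-safe: "25-Mar-2009"
    print > ("small_logfile." d)
}' big_logfile
```

Because only the timestamp field is inspected, a date-like string in the URL can't cause a false match, and the file is read once instead of once per day.<br>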
<br>
Then fix your setup so it logs to per-date files to start with...<br>
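For example, Apache's bundled rotatelogs can split at write time (a sketch; the rotatelogs path and log directory vary by distribution, so check yours):<br>

```apache
# httpd.conf: pipe access log entries through rotatelogs, starting a
# new file at midnight (86400 seconds), named after the current date.
CustomLog "|/usr/sbin/rotatelogs /var/log/apache2/access.%Y-%m-%d.log 86400" combined
```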
<br>
Tet<br>
<br>
--<br>
The greatest shortcoming of the human race is our inability to<br>
understand the exponential function -- Albert Bartlett<br>
<font color="#888888">--<br>
</font><div><div></div><div>Gllug mailing list - <a href="mailto:Gllug@gllug.org.uk" target="_blank">Gllug@gllug.org.uk</a><br>
<a href="http://lists.gllug.org.uk/mailman/listinfo/gllug" target="_blank">http://lists.gllug.org.uk/mailman/listinfo/gllug</a></div></div></blockquote></div></div><div><br>Hi Tet,<br><br>That's just what I needed. I promise to practice my bash scripting while this script runs through these logs.<br>
<br><br>Many thanks,<br>Will<br></div></div></blockquote><div><br>One question: how can I adjust it to work over multiple log files? I have tried, but I keep breaking it.<br></div></div><br>Thanks,<br>Will<br>
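One likely tripwire for the multiple-file case, sketched below with stand-in file names (log.a, log.b): fgrep needs -h once it is given more than one file, otherwise it prefixes every match with the filename and corrupts the output log lines.<br>

```shell
# Stand-in input files; in the real loop these would be the rotated logs.
printf '%s\n' '1.2.3.4 - - [25/Mar/2009:10:00:00 +0000] "GET / HTTP/1.1" 200 1' > log.a
printf '%s\n' '9.9.9.9 - - [25/Mar/2009:12:00:00 +0000] "GET /y HTTP/1.1" 200 2' > log.b

# The loop body from the script, extended to several files. -h suppresses
# the "filename:" prefix fgrep adds when searching multiple files.
indate="25/Mar/2009"
outdate="2009-03-25"
fgrep -h "$indate" log.a log.b > "small_logfile.$outdate"
```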