big numbers
Thursday is sons and daughters day at work, so I was thinking I should come up with some kind of traffic number to throw out. I knew we were sitting at around 50 million page views per month for the site, but we get a whole lot of image-scarfing attempts and our pages are pretty asset heavy, so page views undersell it. I figured I’d get a count of how many requests the sites are actually handling. A few server jumps, a zcat here and a wc there, and when all was said and done I came out to about 45 million requests per day, 30 million of those to our image servers. Those totals came off a Monday, which is a good traffic day; we normally peak then and drop a little through the rest of the week. Still, I’ve got to believe we’re doing over 1 billion requests a month. 1 billion requests *Dr. Evil laugh*
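For the curious, the counting was nothing fancier than this sort of thing on each box (the log path here is a stand-in; every server names its rotations a little differently):

zcat /var/log/httpd/access_log.1.gz | wc -l

One log line per request, summed across the web and image servers, lands at the ~45 million. The back-of-the-envelope math holds up too: 45 million a day over a 30-day month is about 1.35 billion, so even allowing for the drop-off after Monday, a billion a month looks safe.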
I was also working a little tonight on pulling in old email archives. In the last year I’ve moved from an HP laptop to a MacBook Pro running OS X, to that same MBP now running Ubuntu, and I haven’t gotten around to pulling all my old email archives into my latest build. So that was the project for tonight, and I found that my default mac2unix Perl script
#!/usr/bin/perl
# mac2unix: swap classic Mac CR line endings for Unix LF
while (<STDIN>) {
    s/\r/\n/g;   # turn every CR into an LF
    print;       # prints $_ by default
}
wouldn’t do the job once my mbox files surpassed 200MB. Which figures: a classic Mac file has no \n in it at all, so Perl slurps the entire file into memory as one giant “line” before the substitution ever runs. Replacing it with a simple
tr '\r' '\n' < infile > outfile
did the trick nicely when run over each file (a sketch of the loop is at the bottom of this post). I put all my archive mail in Claws and set up a nonfunctional email account to hold it. Ideally this gives me a good place to pull things out of my daily churn as they get old and set them aside for reference. I really think I need to keep Evolution slim, and whatever I’ve got on the Exchange server at a bare minimum, to get the performance I’d like to see.
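The loop itself was nothing clever, something along these lines (the directory name is a stand-in for wherever your mboxes actually live):

for f in ~/mail-archive/*.mbox ; do
    tr '\r' '\n' < "$f" > "$f.unix" && mv "$f.unix" "$f"
done

Unlike the Perl version, tr streams instead of buffering the whole file, so a 200MB mbox goes through without blinking.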