Quoting Dave Sherohman (esper at sherohman.org):

> I'm sure the spiders have plenty of non-TCLUG copies of LKML in their
> databases already. Have you considered setting up a robots.txt to
> kill off that load, even if only as a temporary measure until the
> LKML injection is complete? It wouldn't solve the performance
> problems, but I would expect it to help some.
>
> BTW, if python developers are found to work on mailman, let me know.
> I'd be happy to throw some money in the pot.

Good idea, but that kind of defeats the purpose of the archives. I want
them indexed so people (myself included) can search them.

-- 
Minneapolis St. Paul Twin Cities MN  |  Phone: (952) 943-8700
http://www.mn-linux.org  Minnesota Linux |  Fax: (952) 943-8500
Key fingerprint = 6C E9 51 4F D5 3E 4C 66 62 A9 10 E5 35 85 39 D9
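
[Editor's note: for reference, a minimal robots.txt along the lines Dave
suggests might look like the sketch below. The /pipermail/ path is an
assumption (Mailman's default archive location); the real archive URL on
the server would need to be substituted.]

    # Hypothetical robots.txt sketch: block well-behaved crawlers from the
    # list archives only, leaving the rest of the site indexable.
    User-agent: *
    Disallow: /pipermail/

[Dropping such a file at the web root would keep compliant crawlers out of
the archive pages while the LKML injection runs, and it could simply be
removed afterward so the archives get indexed again.]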