I wanted to write since it seems you're getting some flak in response to your well-articulated blog. You probably already know this well enough, but it is something that Vaughan-Nichols has been publishing and that I have been trying to convey in similar published articles. Only those who've observed Microsoft's business strategy over the years seem to appreciate the key elements of your statement.
So, to help counteract some of that flak, I wanted to say THANK YOU for
holding a position, standing on substance, and being candid.
—
Mark Rais
Note: This letter is in response to blog articles published exclusively on the www.linuxjournal.com site: “Novell Is Loading Microsoft's Gun”, www.linuxjournal.com/node/1000129 and “A Five Year Deal with Microsoft to Dump Novell/SUSE”, www.linuxjournal.com/node/1000121. —Ed.
In the November 2006 Letters section, Marcel Gagné writes, “With a very few exceptions (such as Mr DeSouza), I get nothing but praise for my articles.”
Let me cause another exception for Mr Gagné and add my voice to those
who find the style of Mr Gagné's articles extremely annoying.
Usually I just scan the article looking for the actual content, as the
fluff is too cheesy and bothersome to wade through. Asking around my
department (there are many avid readers of Linux Journal here), no one
else even bothers to read the article because of the style.
—
Chris Russell
With apologies to Marcel for inadequately borrowing his style, Francois serves a sweet white wine to some and a rich full-bodied red to others. Most of our readers love Marcel's column. Your exception is respected and noted. —Ed.
Every time I was starting a new project, Linux Journal beat me to it with an introductory article. When I wanted to go PPC, my new issue just arrived. When I was reading about Qtopia, my new issue arrived. When they introduced the Nokia 770, my issue arrived to tell me all about its UI.
I don't expect every article to be meaningful for me, but I do expect something from every issue. Telling us about x264 was great (the part about QuickTime users in particular), but the rest of the issue needs to be refocused.
What would I like to see? How about a technical discussion of how the
Zaurus “Sharp ROM” is put together? How about a discussion on how to get
the 770's window system working for a workstation? How about an article
about the Qtopia phone? Does Trolltech offer a GPL phone edition for
use with the new Wi-Fi IP SIP phones (for example, WIP300)? I could go on and on
about the growing trends in these directions.
—
J.
Thanks for the suggestions. We welcome input like this. —Ed.
I enjoyed Girish Venkatachalam's article in the August 2006 issue of LJ about developing P2P protocols across NAT. I was particularly interested in some assertions made and was wondering if Girish can provide some references to those.
In particular, “At least 50% of the most commonly used networking applications use peer-to-peer technology.” This doesn't seem right.
I always thought that, being connection-based, TCP was a lot easier to
NAT than UDP, although some TCP applications make it harder by including
IP addresses in the application-layer part of the datagrams (such
as FTP in “normal” mode), requiring the NAT router to inspect
and modify every FTP datagram.
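As a concrete illustration of that last point, here is a small sketch (the address and values below are invented) of how FTP's PORT command embeds the client's IP address and data port inside the application payload, which is exactly what a NAT router would have to find and rewrite:

```shell
# Made-up example of an FTP PORT command (RFC 959 format: h1,h2,h3,h4,p1,p2).
port_cmd="PORT 192,168,1,10,19,137"

# Split the comma-separated argument into positional parameters.
set -- $(echo "$port_cmd" | cut -d' ' -f2 | tr ',' ' ')

# First four fields are the IP address; last two encode the port as p1*256 + p2.
ip="$1.$2.$3.$4"
port=$(( $5 * 256 + $6 ))
echo "client wants data connection to $ip port $port"
```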
—
Bob
Regarding the yellow warning about SSH dictionary attacks [LJ, December 2006, “A Server (Almost) of Your Own” by George Belotsky], try the wonderful DenyHosts (www.denyhosts.net).
DenyHosts monitors incoming connections to your server (mainly SSH, but it can be FTP, POP or anything else with a login/password and a log file) and blocks source IP addresses by automatically adding entries to /etc/hosts.deny.
I set it up on a brand-new Web server a month ago, and it has already accumulated more than 7,000 forbidden addresses!
Besides, you also can share your blacklist with DenyHosts' Web site,
feeding a mega-blacklist of the really bad guys.
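For context, the entries DenyHosts appends to /etc/hosts.deny follow the standard daemon: client format of the TCP wrappers access-control files (the addresses below are invented examples):

```
# /etc/hosts.deny -- entries added automatically by DenyHosts
sshd: 192.0.2.17
sshd: 203.0.113.44
```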
—
Carlos Vidal
I've been waiting and waiting and waiting for Java articles. How about
Java applets (Ajax before Ajax), Java servlets or Swing (and real cross-platform stuff)? I look forward to hearing and reading about it.
—
Mark Molenda
We have just such an article in the works! —Ed.
In the May 2006 issue of LJ, there was an article by Dee-Ann LeBlanc on the above subject. Unfortunately, the emulator that Dee-Ann recommended is still available but is now unsupported on Linux; its Linux developer left the project. But all is not lost; Linux will not be beaten. There is a new Linux emulator project called PCSX2, and it can be found at pcsxii4unix.sourceforge.net. The new version is not complete yet; they need help.
Thanks, LJ, for the best Linux mag on the continent.
Always remember:
there are Linux users, and then there is the rest of society.
—
Des Cavin
In his letter [December 2006 issue of LJ], Jon Alexander described how he surprised his friend by
logging in with different desktops. If he really wanted to impress his
friend, he could have logged in with multiple desktops and then switched
between them. He even could have included a couple of remote desktops
for good measure. Also, don't forget about the virtual desktops that
Linux supports.
—
James Knott
It was interesting to read Jon “maddog” Hall's experience in his article titled “Soweto: Power from the People” [see the UpFront section of the December 2006 issue of LJ] and compare it with an experience our computer club had a few years ago.
The club committee decided that if members donated their redundant computer equipment to the club, the club would pass it on to a Soweto school and help them set it up as a local network, with a connection to the Internet to follow.
The installation went fine, and the local network also worked; however,
when the installation team arrived to connect the school to the Internet,
what did they find? Every single piece of equipment had either
been stolen or broken. It was subsequently discovered that some of the
teachers were responsible for some of the theft.
—
Alf Stockton
I admit that I did not read the original article “Analyzing Log Files” [October 2006 issue of LJ] by Dave Taylor. I did, however, see the “Optimal awking” letter [Letters, December 2006 issue of LJ]. Being an old “bit twiddler”, I was interested in the improved run time achieved by reordering the original:
awk '{ print }' access_log | sort | uniq -c | \
    sort -rn | grep "\.html" | head
to:
awk '{ print }' access_log | grep '\.html' | sort \
    | uniq -c | sort -rn | head
Now, I'm not really an awk person, but I was curious as to what the awk program did. Apparently, it is just an expensive version of cat—that is, it copies its input to stdout, unchanged. In that case, why even have it? Also, why use grep? Instead, use fgrep, which, in this case, produces the same result with somewhat less overhead:
fgrep '.html' access_log | sort | uniq -c | sort -rn | head
[This] should produce the same output and totally eliminate the awk. In this case, no big deal. Unfortunately, many neophytes will pick up a script from a magazine and use it without really understanding it. So, I am a bit picky about examples. For a one-shot, this is no big deal. But I am a bug for efficiency—comes from programming back on 1MHz 8080s, I guess.
Unfortunately, I don't have a Web access_log to do any testing to see
if this really makes much of a difference.
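Lacking a real log, one way to at least confirm that the fgrep version produces the same output as the original pipeline is to run both against a tiny invented log (the paths below are made up):

```shell
# Build a tiny, made-up access_log so the two pipelines can be compared.
printf '%s\n' /index.html /index.html /about.html /logo.png > access_log

# Original ordering: count everything first, filter for .html afterward.
orig=$(awk '{ print }' access_log | sort | uniq -c | sort -rn | grep '\.html' | head)

# fgrep version: filter first, then count -- less work, same answer.
fast=$(fgrep '.html' access_log | sort | uniq -c | sort -rn | head)

[ "$orig" = "$fast" ] && echo "pipelines agree"
rm -f access_log
```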
—
John McKown
I hesitate to take issue with an authority like Dave Taylor, but [his script] uses repeated (and in many cases redundant) system calls and divisions to achieve what simple multiplications can do [see Dave's December 2006 column]. It also converts the results to the wrong units. (See man units for a discussion.)
A simple algorithm to achieve the same effect is embedded in a test harness as follows:
#!/bin/bash
# Script for numeric scaling - $1 = number, $2 = iterations
# (bash shebang, since the (( )) loop syntax is a bashism)
for (( i = 1; i <= $2; i++ ))
do
    ki=1024
    mi=$(($ki*$ki))     # 1048576 without typo risks
    gi=$(($ki*$mi))     # 1073741824 without typos
    value=$1
    if [ $value -lt $ki ] ; then
        units="bytes"
    elif [ $value -lt $mi ] ; then
        units="KiB"
        div=$ki         # < 1 Mi, so calculate Kibytes
    elif [ $value -lt $gi ] ; then
        units="MiB"
        div=$mi         # < 1 Gi, so calculate Megs
    else
        units="GiB"
        div=$gi         # >= 1 Gi, so calculate Gigs
    fi
    if [ $units != "bytes" ] ; then
        # scale value appropriately
        value=$(echo "scale = 2; $value / $div" | bc )
    fi
    echo "$value $units"
done
# End tcon2
Running 1,000 iterations of each on an HP laptop with an AMD 2500 chip
showed the revised version to take approximately one-quarter of the time
(real, user, and system) of the original.
—
Alan Rocker
I don't think Dave Taylor gives awk enough credit [see Dave's November 2006 column]. I do not have access to the same Web files, logs or version of Linux. However, I do know that his solution can be written entirely with awk. Using AIX and HP-UX, I did dummy up a mail log file, cheated on the date command and tested my awk solution.
Below is awk code that I think would duplicate Dave's example:
#!/bin/sh
LOGFILE="/home/limbol/logs/intuitive/access_log"
awk '
    ( index($0, YESTERDAY) ) {
        hits++
        bytes += $10
        next
    }
    END {
        printf("Calculating transfer data for %s\n", YESTERDAY)
        printf("Sent %d bytes of data across %d hits\n", bytes, hits)
        printf("For an average of %d bytes/hit\n", (bytes / hits))
        printf("Estimated monthly transfer rate: %d\n", (bytes * 30))
    }
' YESTERDAY="$(date -v-1d +%d/%b/%Y)" ${LOGFILE}
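One portability note on the date cheat: date -v-1d is BSD-style syntax, which GNU date does not accept. On a GNU/Linux system, a sketch of the equivalent would be:

```shell
# GNU coreutils equivalent of the BSD-style "date -v-1d":
YESTERDAY="$(date -d yesterday +%d/%b/%Y)"
echo "$YESTERDAY"    # e.g. 04/Jan/2007
```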
I found it a little bit ironic that you chose Ubuntu as the 2006 Editors' Choice Linux distribution, but that you chose KDE as your Editors' Choice desktop environment. It seems to me that if you were going to pick Ubuntu, you'd choose GNOME, and if you were going to choose KDE, you would choose Kubuntu. Are there reasons you picked Ubuntu over Kubuntu, or did you simply mean (K)ubuntu in general for your Editors' Choice distribution?
PS. Long live KDE!
—
Geoff
As we said in our write-up, we also find it a puzzler as to why Ubuntu seems to be a favorite, yet research data shows people prefer KDE over GNOME by a significant margin. Perhaps people refer to all variants of Ubuntu as Ubuntu, even if what they're really using is Kubuntu. Or, maybe others do what some of us at Linux Journal do: install Ubuntu and then install and use KDE (thus essentially converting it to Kubuntu). —Ed.
Some of the code was inadvertently formatted incorrectly in George Belotsky's “A Server (Almost) of Your Own” in our December 2006 issue. For the corrected version of the article, please see /article/8337.