Sunday 10 January 2016

Grey Text Always Makes Me Cry

What's with all this grey text on the web these days? It makes my eyes water. Some years ago I started to have trouble reading the tiny text in newspapers and on some websites. At first, I put it down to having had a couple of beers too many the night before, the glare of sun on the page, or other random factors.

When I bought my first smartphone, I realized my eyesight had deteriorated somewhat, due to middle age. My previously 20-20 perfect vision had succumbed to the ravages of time, and I had to invest in reading glasses. I couldn't focus on my phone unless I held it at arm's length from my face, at which point everything was too small to see. That's what the mobile revolution meant for me.

Reading on my computer didn't present a problem. The computer monitor is far enough away for old eyes to focus on, and when the text is too small you can hit Ctrl and + to zoom in. However, around five years ago light grey text on 'less than white' backgrounds began to proliferate. What gives?

Back in the 1990s, in the previous century when the World Wide Web was born, there were a lot of (mostly failed) experiments with font-color/background-color combinations. Any combination of colors you can think of was tried. Freed from the simple black ink on white paper publishing paradigm that had confined us for a thousand years or more, writers began to write yellow text on green backgrounds, red on blue, and so on. There was a proliferation of eyesores that was unprecedented in history. One favorite that still persists on some websites today was white text on a black background. Hey! We are now in the digital age, and we can turn the old standards on their head; that is what a paradigm shift is all about, yes?

No. It sucks. If it brings tears to your eyes, it sucks. If it makes your eyes bleed, it's worse than sucks.

Back in the day people also used frames, flash landing pages, and under-construction icons. Such things were orthodox, but they sucked, big time.

Websites using grey text, including light-grey text, and even, God forbid, light-grey text on greyish backgrounds, have proliferated in the last five years. Why is this so?

Human psychology might give us a clue. Most people are followers, and as such, throughout history from ancient times until today, religions, false beliefs, and cults have sprung up and continue to do so. The practice of using grey text on web pages is no different. It is grounded in ignorance, and when ignorance prevails, even smart people follow.

I believe this grey text trend was caused by Content Management Systems being made freely available to people who want to publish on the Internet. There's a price to pay for everything that comes free. The developers of content management systems needed to put their product out for free, but also needed to recruit paying users. One way of doing this was to make the default font color of the content management system grey. If end users couldn't figure out how to change the font color to black, they would have to pay the CMS creator to help them do it.

This is a business model that ultimately failed. While one would assume that the greying out of text would cause web publishers to seek a solution, most of the new publishers failed to grasp one of the fundamental principles of publishing: readability. The proliferation of websites using grey text grew to the point that new web publishers now think that grey text on a white or less-than-white background is the standard, de rigueur font color on the World Wide Web. After all, everybody else is doing it, right?






Monday 4 January 2016

How Social Networking Buttons Can Frack Your Website, Slow it Down, and Prevent W3C Validation

I work hard to keep my code clean, my web pages loading fast, and my HTML and CSS validated to the most up-to-date W3C specifications. Recently, I overhauled an old website of mine originally created using HTML 4.01 transitional. I spent a couple of months working on hundreds of pages to get them to validate to HTML 5 [W3C Website Validation Service]. At the same time, I used Google's Page Speed Insights to improve page speed and fix mobile usability issues.

Having accomplished all of this, I wanted to share the results of my hard work with the world, so I went searching for information on how to include social sharing buttons on my web pages. I already had social sharing buttons on my WordPress sites, where adding share buttons is as simple as clicking a few buttons to install a plugin, but how do you include share buttons on an HTML website?

The most popular method of adding a Facebook like button to your webpage is to use Facebook's Like Button Configurator, which will produce two pieces of JavaScript to add to your blog. One piece of code must be placed immediately after the opening BODY tag, and the other piece placed where you want the button to appear on your page. Like a mug, I went ahead and placed these pieces of code on every one of my hundred or so pages. I probably spent about 40 to 50 hours doing this. How could I be wrong? After all, when you search for 'how to add a Facebook button to your webpage', it's what all the techie blogs and 'experts' recommend, and it gives you a Facebook like button exactly the same as everyone else is using.
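
For reference, the configurator's output looked roughly like the two snippets below, reproduced from memory, so treat the exact SDK version, data attributes, and placeholder page URL as approximate: an SDK loader placed just after the opening BODY tag, and a placeholder element dropped wherever the button should render.

<!-- 1. SDK loader, placed immediately after the opening <body> tag -->
<div id="fb-root"></div>
<script>(function(d, s, id) {
  var js, fjs = d.getElementsByTagName(s)[0];
  if (d.getElementById(id)) return;
  js = d.createElement(s); js.id = id;
  js.src = "//connect.facebook.net/en_US/sdk.js#xfbml=1&version=v2.5";
  fjs.parentNode.insertBefore(js, fjs);
}(document, 'script', 'facebook-jssdk'));</script>

<!-- 2. Button placeholder, placed wherever the like button should appear -->
<div class="fb-like" data-href="http://www.example.com/your-page.html"
  data-layout="button_count" data-action="like" data-show-faces="false" data-share="false"></div>

The SDK loader is the part that does the damage: it pulls in a sizeable external script on every page load, which is exactly the kind of thing Page Speed Insights penalizes.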

It was about a month later, when I was using Google Webmaster Tools to do a general check-up for crawl errors and other usability issues, that I got a profound shock - my web pages had mobile usability issues due to slow download speed. How could this be? I had worked hard and succeeded in getting every page to score over 90 points out of 100 on Google's Page Speed Insights. My page speed scores had dropped to less than 80 points, and some pages were in the red zone, signifying serious usability issues for mobile device users.

The problem was, of course, the Facebook like button. This one little piece of code slowed down each page on which I placed it by 15 points on Page Speed Insights - about a 15% slowdown in the real world. The Facebook like buttons had to go, but surely there must be a simple solution? There is: a plain HTML link that opens Facebook's sharer, with no external JavaScript at all.

<a class="top_social" href="javascript:window.location=%22http://www.facebook.com/sharer.php?u=%22+encodeURIComponent(document.location)+%22&#38;t=%22+encodeURIComponent(document.title)" title="Share on Facebook..."><img src="LINK TO YOUR OWN FACEBOOK LIKE BUTTON IMAGE" alt="share this page on facebook"></a>


Note that you have to grab a Facebook like button image from somewhere and add it to your site. I just found one, then cropped it, resized it, and changed the button's background to suit my site's background color. Job done: now I have a Facebook like button on every page, with no download slowdown, and my pages score from the mid to high 90s on Google Page Speed Insights.

Next, I wanted to add Google+ buttons to my site. Now, you would expect that everything made by the world's #1 Internet company, and one that is so fussy about the quality of your webpages, would be A1 in quality and made to the standards and specifications of the World Wide Web Consortium. But no. After adding Google+ buttons to all of my webpages, I found they no longer validated to HTML 5.

<a href="https://plus.google.com/share?url={http://YOUR WEBPAGE/}" onclick="javascript:window.open(this.href,
  '', 'menubar=no,toolbar=no,resizable=yes,scrollbars=yes,height=600,width=600');return false;"><img
  src="https://www.gstatic.com/images/icons/gplus-32.png" alt="Share on Google+"/></a>

The problem here is '{' and '}'. These characters are not valid in a URL, so the W3C validator rejects the href; they need to be percent-encoded as '%7b' and '%7d' respectively.
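
For example, the same snippet with the braces percent-encoded looks like this (the placeholder URL is kept from the snippet above and still needs to be replaced with your own page address before the page will validate):

<a href="https://plus.google.com/share?url=%7bhttp://YOUR WEBPAGE/%7d" onclick="javascript:window.open(this.href,
  '', 'menubar=no,toolbar=no,resizable=yes,scrollbars=yes,height=600,width=600');return false;"><img
  src="https://www.gstatic.com/images/icons/gplus-32.png" alt="Share on Google+"/></a>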

So finally, what I expected would be the simple task of adding social sharing buttons to my web pages has been accomplished, after several months of work and some serious mistakes made by following the procedures recommended by the social sharing networks themselves and by the various tech blogs parroting each other's expert opinions and recommendations. I hope this information may help you avoid making the same mistakes.

You can see the final results on my website, Atherton Tableland Netguide.

Just a Blog

I set up this blog around 6 years ago, not really knowing what to do with it. I wasn't really into 'blogging', and had always been suspicious of the term and the lexical ugliness of the word and its offshoots - blogging, blogger, blogosphere, etc.

Since 2006, I've been using WordPress as a content management system for The Wild East Magazine and other websites in my network. It has proven versatile, and is far more than a blog platform. Today it can support magazines, newspapers, photo galleries, and many more applications.

I have six websites to actively maintain, and five of them are managed using WordPress. Over the years, I have learned to install themes, tweak them, make child themes, and sometimes get into the PHP. I am resistant to writing computer code. I don't like doing it. Don't get me wrong; I appreciate coding, and I like what it can do. I just want to write in a human language, rather than learning how to communicate with machines, bots, and other non-human user-agents.

Last year at this time, I started to sacrifice the time I spent working on my WordPress sites to pay attention to my long-neglected old website Atherton Tableland Netguide, which was originally made using HTML and CSS. Sometimes, as I crawl through the hundreds of pages on that site, like a human web-bot, updating design and content, I think 'this would be much easier if I used PHP.' But I don't want to do that.

When you consider digital applications, they are a reflection of the physical world. Automation and standardization are desirable attributes in the world of economics. The benefits of automation and standardization can be carried over into the information economy, and to a large extent they have been. Being human, however, is more than a matter of being economic, efficient, automated, and standardized.

Cockroaches are one of the most efficient lifeforms on our planet. In fact, most species on earth could be categorized as more efficient than humans. Can you think of a particular species that is less efficient than ours?

The simpler the life-form, the more automated its responses: hence we name our new digital user-agents web-crawlers and spiders, after well-established species that preceded us by hundreds of millions of years, and we cast bots as a new digital version of our invertebrate ancestors. We are just attempting to re-invent the wheel.

There is still a place for art in this world, both online and in the physical world we inhabit. As artists, we can make use of improvements in efficiency; but automation and standardization? What value are these?