I have been online almost constantly since 1996, and overall I am very pleased with the internet's development into a universal platform for acquiring information, working on common projects, exchanging opinions, doing some shopping and banking, and having fun. Some trends, however, have become increasingly annoying, both for me as the webmaster of a big site and as a "normal" user.
Disclaimer: While I'm doing my best to stick to the facts in this commentary, please note that some of the observations may be outdated and that my perception may be very personal.
Sure. Who doesn't hate spam? But most of all I am astonished that the overall spam volume is still rising after so many years, although there can be close to no return from it. Currently the vast majority of all e-mail messages have to be considered spam. If it hasn't already been filtered out automatically, anyone who is not a complete moron can recognize spam in milliseconds. This used to be different in the mid-90s, when I actually read some of the then infrequent spam mails, especially the amusing ones pretending to come from former African dictators.
I reckon that when some dick sends out 10 million spam messages today, only some 500,000 will arrive and get noticed at all, for a split second. Of these 500,000 human recipients, maybe 5000 will read more than a few words. Of these 5000, only 50 will click a link to a penis enlargement site because, if any information is provided at all, it is quickly recognizable as fraudulent or at least untrustworthy (for instance, because of bad spelling). Statistically, I believe that at most 1 of these 50 people will buy anything. And even this is only possible under the assumption that the site works and sells anything at all and is not just another dead link or a redirect to something totally different - many spam-advertised sites appear to be dead ends!
So even if it is possible to earn a few cents per 10 million spam messages simply because someone pays for the page clicks, there can never be sufficient return to justify the (admittedly small) effort of sending out all these mails. Common sense tells me that the whole system should be increasingly unlikely to work, even if it were legal. And if sending a single e-mail cost only 0.1 cent, the spam business should have collapsed long ago. So why is it still thriving? Unless the spammers themselves are complete idiots (which is what most spam messages look like), spam must be a miraculous way to find the needle in the haystack, the one out of ten million who is still gullible enough.
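Just as a back-of-the-envelope sketch, the funnel above can be put into numbers. The rates are my own guesses from the previous paragraphs, and the profit per sale is a made-up figure:

```python
# Back-of-the-envelope spam economics, using the guessed funnel from above.
sent = 10_000_000      # spam messages sent
noticed = 500_000      # arrive and get noticed for a split second
read = 5_000           # read more than a few words
clicked = 50           # actually click the link
bought = 1             # at most one buyer, statistically

profit_per_sale = 20.0   # hypothetical profit per sale, in dollars
cost_per_mail = 0.001    # the hypothetical 0.1 cent per e-mail

revenue = bought * profit_per_sale
cost = sent * cost_per_mail
print(f"Revenue: ${revenue:.2f}, cost: ${cost:.2f}, profit: ${revenue - cost:.2f}")
# With any non-zero sending cost the business collapses; only because
# sending is virtually free can the one gullible buyer in ten million pay off.
```

With these numbers, a 0.1-cent sending cost would turn the 10 million mails into a loss of nearly $10,000 per single sale, which is exactly why the near-zero cost of e-mail keeps the scheme alive.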
On a side note, regarding the customary "I need your bank account to transfer my money" fraud (the Nigeria scam), I read about an investigation that came to the startling result that highly educated people are much more likely to fall for it than users with average education!
Of all fraudulent internet manipulations, phishing is probably the most insidious one, because you can never be careful enough. I sometimes spend a few minutes before I can rule out the authenticity of a message claiming to come from a bank, from PayPal or from eBay. The spoofed messages very often look quite genuine.
Here is what I do when I suspect that I have received a phishing mail (in about this order):
- Have I ever given away my e-mail address to my bank or any other company or agency (especially regional)? If they don't know my e-mail address, the mail can't be from them.
- Have I received e-mails from them before? If a suspect e-mail with a subject like "Your Account Will Be Suspended" is the first one I have ever received from them, I discard it, because it very likely doesn't come from them.
- Is the language in the mail correct? I have a German eBay account, and English or Spanish mails pretending to come from eBay are likely phishing attempts.
- Is the e-mail header correct? The "From:" field is not what matters, because the phisher can put anything into it, such as "<eBay> firstname.lastname@example.org". The decisive criterion is the server from which the message was sent; it can be made visible in the options menu of the e-mail client or by saving the whole message as plain text. If it's something like someserver.ru, then the e-mail is from Russia without love and not from your local bank (unless you're Russian, of course, no offense meant!).
- Is the URL of the site that you're referred to correct? Most fabricated URLs should be obvious. The URL "secureserver.ebay.evilurl.com" can't be an eBay server, because the second-level domain (the second-to-last part of the hostname, here "evilurl") is the decisive mark. I am surprised how many people still don't know how to read URLs after so many years and think that everything is okay if just the third or fourth level looks trustworthy!
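The last two checks can be sketched in code. Note this is a simplified illustration: a robust registrable-domain check needs the Public Suffix List (think "co.uk"), which this naive two-label version ignores, and real Received: headers are messier than the crude parsing below assumes.

```python
from email import message_from_string

def registrable_domain(hostname: str) -> str:
    """Naive registrable domain: the last two labels of the hostname.
    Real code should consult the Public Suffix List ("co.uk" etc.)."""
    return ".".join(hostname.lower().split(".")[-2:])

def looks_spoofed(link_host: str, claimed_host: str) -> bool:
    """A trustworthy-looking third- or fourth-level label proves nothing;
    only the registrable domain counts."""
    return registrable_domain(link_host) != registrable_domain(claimed_host)

def earliest_hop(raw_mail: str) -> str:
    """Extract the server name from the bottom-most "Received:" header,
    i.e. the hop closest to the actual sender (crude parsing)."""
    msg = message_from_string(raw_mail)
    hops = msg.get_all("Received") or []
    return hops[-1].split("from", 1)[1].split()[0]

print(looks_spoofed("secureserver.ebay.evilurl.com", "www.ebay.com"))  # True
```

So "secureserver.ebay.evilurl.com" boils down to the registrable domain "evilurl.com", and that is the only part that matters, no matter how reassuring the labels in front of it look.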
Modern web design and SMO
Moreover, it has become a strange habit that company websites don't readily give away what the company is about. Very often there is nothing like an introductory paragraph such as "[Techcomp] is a leading provider of [tech]..." It almost seems that today's cutting-edge technology companies all produce and sell the same thing: huge pictures, which are further "explained" with buzzwords such as "Create", "Experience", "Share", "Get involved".
Many years ago some websites tried to catch the attention of visitors with splash screens before showing any actual navigation aids or any content. This practice was commonly frowned upon for very good reasons and was ultimately abandoned in the early 2000s. Likewise, hidden or animated navigation elements were criticized as "Mystery Meat Navigation" and usually didn't last for long either. It seems the latest generation of web designers repeats all the mistakes of the past, and worse. Accessibility was yesterday. The absence of a clear navigation structure is not only a problem immediately after arriving at a website but often continues all the way through the sub-pages. It looks like "modern" websites don't even want visitors to explore their "traditional" in-depth content, even if this content still exists.
There are several rationales for this development, a few of which are of a technical nature and at least partially understandable. It is clear that for mobile devices a horizontal navigation bar with eight to ten items side by side is not practical any longer. However, while some of the better modern sites offer an expandable menu with the basic links in their mobile versions, they don't care for users of PCs, who are left without a navigation tailored to their needs. And what about those who want to really explore the content, and not only the most recent or "most popular" features? After scrolling or swiping all the way down the page, it is even harder to hit the items in the tiny ersatz navigation bar than on conventional web pages. Accessibility is significantly reduced, *both* on mobile devices and PCs. And don't even get me started about loading megabytes of images on a smartphone without Wi-Fi or LTE!
Summarizing, SMO has established the following unfortunate trends:
- Sharing news is deemed more important than understanding news.
- In-depth content is removed or made hard to find, in favor of dumbed-down news.
- Websites lack uniqueness in design and content. You get the impression you can find everything on Facebook too.
- Megabytes of huge images and animations dominate everything - function follows form.
- The needs of PC users are disregarded as if everyone accessed the internet with smartphones.
Empty spaces as links
One web design habit has recently become so irksome that it deserves a point of its own in this shit list. I'm talking about empty spaces on a web page that are misused as links. The annoyance can be found with increasing frequency in portals with ads (see below), where not just the banner but the whole space to its left and right serves as the link. When I try to select portions of the page or merely click the area by accident, I am in for an unpleasant surprise. Sometimes the whole screen, with the exception of the customarily tiny useful area of a commercial site, becomes a link to the sponsor. And I am absolutely certain that on some sites I am being redirected to an ad by merely hovering over the background!
The phenomenon, however, is not only symptomatic of overly aggressive advertising. It can also be found in apparently well-meant designs, because some people think that if there's a link somewhere in a table, the whole table cell has to become a single big link, without being marked as such (which, just like the hidden link areas of ads, prevents me from selecting the text, at least on the first try, which takes me who knows where).
Finally, some image hosts such as Imageshack or Photobucket have imposed so much overhead on the image display that I never know whether the original no longer exists or whether, due to some browser problem, script glitch or delayed loading of ads, it simply isn't displayed. Whenever I see an embedded image coming from Imageshack or Photobucket anywhere on another site (such as a BBS), I don't even bother clicking it, because they have turned inaccessibility into an art. Imageshack and Photobucket are effectively image viewing prevention services and are among the currently biggest internet annoyances.
Infinite scrolling sounds like a good idea because it seems to allow smooth browsing, especially on mobile devices. However, it comes with severe disadvantages. Firstly, it is a dynamic technique, and whenever it doesn't work perfectly (for example, if the server doesn't respond or the connection deteriorates, especially on mobile devices), it causes the page to hang or to load incompletely. In such a case the visitor has to hit "Reload" and start all over from the very top of the page. It usually takes several minutes to load everything again and find the point where the transfer was interrupted. Infinite scrolling is not just a problem for users with slow connections but also for those with less powerful hardware. At some point their CPU will burn and/or they will run out of memory. The immense amount of content packed on a single page, accompanied by various scripts that are running all the time, is too much for older processors and for machines with less than 2 gigabytes of memory. And finally, even for those with fast enough hardware it is often a pain in the ass to scroll back and find anything in an extremely long list, especially when they are seriously working. In the age of pagination it was still possible to open the list in a second browser tab and continue searching there, which is no longer possible, at least not without starting all over again in the new tab.
The programmers of web pages have obviously forgotten that there were reasons why mankind abandoned scrolls many centuries ago and switched to books.
Mobile usability disasters
The lists of mistakes to avoid on a mobile web page commonly include trivialities such as "too many navigation items" or "links that are too small". But there are some things on many mobile pages that I personally find much more annoying.
Many websites are just too small on my 4.6'' smartphone. If it is a mobile version, website authors usually take care that font sizes are legible. But images are very often too small to recognize essential details, so I have to pinch and zoom on about every second mobile page that I visit. Yet some websites disallow pinch and zoom, for reasons that I don't understand. I fail to see why it is even technically possible to disable it in the first place. Well, after some time I discovered that the mobile versions of Chrome and Firefox can override the anti-zoom setting. "Zoom always possible" should be the default setting in any browser, to end this patronization once and for all!
It is a rather new annoyance that a couple of mobile sites, and in particular news portals, jump to the index page once I scroll down to the bottom of a page. I don't know who could possibly have asked for such behavior. When I scroll to the bottom of a page, in particular on a news portal, it is because I want to see the links to the categories, or the comment section. Or perhaps I just wish to check the conclusion of the article, rather than read it all. News websites also frequently bug me with messages such as "Our home page was updated. Do you want to go there?" There are so many other ways to get back to the index page, and so few to see anything besides the current headlines (and the ads, of course). So please end this madness and let me scroll to where I want!
Some of the problems of mobile websites seem to me like a deliberate hint to get the app instead of using the mobile browser. In the less subtle cases I get big messages such as "Our mobile site sucks, wouldn't you rather use our app?" each time I visit a site. Well, not quite phrased like that, but that is the impression I get. In some extreme cases on Android, certain links to websites even open the Google Play Store and urge me to install the app, instead of taking me to the mobile website!
I understand the desire to avert unwarranted distribution of copyrighted media, or at least to make any duplication as hard as possible. But the need to stream videos instead of downloading them is a nuisance for the user (and it still doesn't prevent computer buffs from ripping streams). Most importantly, streams are a huge and continually increasing waste of bandwidth. This may ultimately come at the expense of HTML, e-mail or download traffic, all of which I rate as much more valuable than a transitory video.
Streaming effectively excludes everyone without a sufficiently fast connection from accessing online media in the first place. It almost seems like video portals are cooperating with providers of DSL or mobile internet, in order to allow them to sell ever broader bandwidths. Case in point: UMTS is usually not sufficient for streaming, so streaming fuels the development of LTE. But even with fast internet, streaming never worked well enough for me, and perhaps never will. Even if it happens "only" once every two minutes, it annoys the hell out of me when the stream stalls and, shortly afterwards, the playback hangs, making it inferior to analog radio or TV broadcasts that have been working smoothly for almost a century. But the providers of online media, among them TV stations whose conventional broadcasts are doing fine, apparently expect me to put up with such a disrupted "viewing experience" when watching a whole movie or episode as a stream. No thanks. And when I try to load a video stream in the background while doing something useful in the meantime, it happens quite often that the video is gone when I return and I have to load everything again (speaking of wasted bandwidth).
Well, after I gave up on listening to web radio in 2001, when I already had Mbit/s internet access at the university, audio streams have finally reached a decent quality in more recent years. But I have no hope that the streams of video portals or TV stations will meet my expectations of adequate online video any time soon - that is, a quality and reliability at least on par with old-fashioned analog TV.
I fail at a large share of the CAPTCHAs that I encounter. Am I just too stupid? No, that honor belongs to the people who conceived the CAPTCHAs. They rarely work the way they should, for one or more of the following reasons:
- Characters are not recognizable at all because the image lacks the necessary contrast or because the font is awful.
- Numbers and letters look alike, such as "g" and "9".
- CAPTCHAs may be case-sensitive, but many capital and small letters look alike, such as "C" and "c".
- The orientation of the letters is not clear, so "z" may look like "N".
- With letters and numbers all over the place their correct order is not clear.
- Sometimes I have to solve an additional puzzle by typing only a certain selection of the barely visible characters.
- In some cases I suspect the displayed image does not even belong to the expected input.
- Occasionally the image does not load at all.
As a website owner I know only too well how nasty spambots can be. But "enhancing" CAPTCHAs by making them ever more enigmatic riddles doesn't help. It only annoys the hell out of the human visitors. The much better option is to ask a simple question that a human being can easily answer. Since I switched to such a procedure, asking a question that every casual Trek fan can easily answer, there have never been any automated spam entries in my guestbook.
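For illustration, such a question-based check can be sketched in a few lines. The question and the accepted answers here are made up; my actual guestbook question is different (and it would be silly to publish it):

```python
import re

# Hypothetical challenge question shown next to the guestbook form.
QUESTION = "What is the name of Captain Picard's ship?"
ACCEPTED = {"enterprise", "uss enterprise", "enterprise-d"}

def normalize(answer: str) -> str:
    # Lower-case and collapse everything except letters, digits and dashes.
    return re.sub(r"[^a-z0-9-]+", " ", answer.lower()).strip()

def is_probably_human(answer: str) -> bool:
    """A generic spambot that has never seen this particular question
    cannot answer it, while any casual Trek fan can."""
    return normalize(answer) in ACCEPTED

print(is_probably_human("Enterprise"))       # True
print(is_probably_human("buy cheap pills"))  # False
```

The point of the design is that the secret is the question itself, not a distorted image: a bot would need site-specific knowledge, which mass-spamming tools don't have, while the human visitor doesn't have to decipher anything.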
Overcrowded and ad-polluted portal websites
The portals of fully or partially online-based companies such as e-mail providers, other internet services, media companies, newspapers or TV stations are visited by millions of people every day. While this should be an incentive for them to come up with a particularly decent-looking, fast-loading, well-ordered and easy-to-use web design, their portals are very often exactly the opposite. There is usually no consistent site design, and sometimes no visible concept at all. Half of a page's real estate is often reserved for off-site ads. Many websites even give away their page background to ads, which then appears in colors such as pink or yellow (and is sometimes turned into a single big link, see above)! Much of the rest is spent on unrequested videos and Flash animations that waste valuable bandwidth and CPU power. The overall volume of a single page, including all media, is several megabytes, making a visit without DSL impossible. (It is a fairy tale that you just have to wait longer with a 56k modem or with GPRS. In reality your slow connection usually times out after two or three minutes, before a long page has loaded completely.)
Anyway, while pop-ups are fortunately not so common any longer and can be suppressed most of the time, busy Flash ads or ad windows hovering above and/or moving across the page have become commonplace, and it is a pain in the ass to click them away - and to do so without accidentally clicking a link on the page below. There are no real menus, and important functions such as "sitemap", "search", "login", "help", "FAQ" or most obviously "contact" are obscured, making it blindingly obvious that they are not supposed to be found easily (see also "Hidden or cluttered navigation"). Sometimes direct navigation to content pages is not possible, but only via an intermediate page (with more ads, of course). Most portals look like porn sites these days, only without tits.
The rationale for the degradation of the websites of online companies is evident, however. The sites were initially set up as a service for their customers, an online catalog or help desk, many of them as early as the mid-1990s. But now that the internet has become a mass market, it is just too tempting to squeeze profit out of a site by filling it with ads and by offering paid "premium" services that very often have nothing to do with the original purpose of the website or even with the scope of the company itself. When I go to the site of a TV station, I would expect to find extensive information about their programs, such as transcripts, annotations, background information, production information or additional links. But it is a huge disappointment every time I look for such content. I am apparently asking too much, and I am rather expected to play browser games or buy apps on a TV website. As for the ads, it is peculiar how the companies (at least if they are not direct competitors) mutually pollute each other's websites!
The pleasant exception among the web-based companies shouldn't remain unmentioned: Google - a commercial site that not only has a very simple and fast interface but also decent ways to integrate ads into the search results (well, Google's all-dominating AdSense service for other sites is another story). The same praise goes to eBay, although to a lesser degree, because it is not quite as clean any more.
Babylonian sites - Parlez-you deutsch?
For ten years I have been accustomed to using English whenever I am online, unless I am moving through a purely German realm such as eBay, other online shops, my bank account etc. However, all sorts of sites force me to use German even when I don't want to. They are "intelligent", analyze my IP address and switch to a customized German version. Likewise, I was frequently presented with a French version, because my former company's network hub is located in France.
Google is among the sites that insist that I speak German or French - when I type "www.google.com" I wind up at "www.google.de" or "www.google.fr", respectively. This wouldn't bother me very much if the search results were not customized as well, with German or French language results always coming first. But it is not only the ordering of the results that differs. Google has separate databases for its different language-specific sub-sites. I may not even find some English-language results with the German or French Google. The only way to switch to the English default version is to click a small link at the bottom of the page, which then takes me to, and keeps me at, "google.com". Well, I could stay logged in, so Google would respect my language preference, but I don't really want them to know everything that I search for.
It's similar with many blogs and communities. On some blogs I am greeted with "Willkommen" because of my IP address. This is fine as long as the German version is correct. But when I visit the blog of some American guy, and it shows the content in English but the standard links, navigation and even his bio in German, it becomes bizarre. Especially since "Education: High School" is routinely translated to "Schulbildung: Gymnasium". But a German Gymnasium is only remotely comparable to a US high school (the better analogy would be "Gesamtschule"), and it must not be confused with a gym! Again, I could try to find the little link that may allow me to change my standard location, but I don't want to do that every time I visit any website, much less would I want to register at every place I visit just to set my language and location preferences.
IMDb, too, has recently exhibited a peculiar behavior. Depending on my IP address, sometimes every bit of text, including the menus, is still in English, but the movie titles are in German. I can't find "Star Trek: First Contact" any longer; it is automatically converted to "Star Trek: Der erste Kontakt" for me! With a different IP address I am welcomed to imdb.com in German, as is customary elsewhere too, and I still find the English titles.
I am only glad that most sites don't attempt to automatically translate essential content with Google Translate to create total confusion... But it is probably only a matter of time. On a funny note, eBay auto-translates some of the item descriptions from foreign eBay sites as a "translation service without warranty". This includes translations of ebay.at items, from Austrian German to German, as it seems! :-)
It is obvious that, above all, the ads are increasingly "customized" for me. But I have the impression that the same applies to more and more of the content. And this is the worst part of the paternalism - it effectively re-establishes national and language borders that, in my opinion, should not exist on the internet. If people in different countries see different content at seemingly the same URL, it not only leads to unnecessary confusion, it may even give rise to conflicts. And considering that I have repeatedly had to explain to people that I couldn't watch certain YouTube videos because they are blocked in Germany, who knows what content that I point visitors to may not be available in the USA.
Modern paternalism - We know better than you what you want!
Much like the sites that automatically switch to the language I am believed to speak, several websites claim to be more intelligent than I am. They anticipate what I am allegedly looking for. The first example has not yet become a commonly accepted technique, and I hope it never will. On some websites hosting blogs or articles, words inside a paragraph are underlined. But what appears to be a wiki-like, keyword-sensitive link to an encyclopedia entry or further information turns out to be a particularly insidious way to incorporate ads. Phew! Sites or hosts doing something as disgusting as this should be boycotted.
A perhaps even worse example of websites that know better what I want than I do myself is the automatic correction of alleged misspellings in searches, including hits for search terms that have nothing to do with what I am looking for. Example: In order to stay informed about where EAS is being cited, I can do a simple web search, combining "ex astris" with various keywords. Google, however, gives me many search results with the term "Astrid" instead of "astris", without telling me that the search was extended. Google obviously assumes that I am looking for a girl named Astrid but am too stupid to spell her name correctly. I don't know what the criteria are, because "astris" is not even an extremely rare word that might justify being "auto-corrected". With other unusual search terms (even obvious typos), Google does its job the way I would expect and only suggests "Did you mean...?" when it suspects a misspelling, which is a lot more transparent and less patronizing. There is absolutely no need for auto-correction.
Finally, why is it that many websites refer me to "most popular topics"? This may be helpful on news sites, but it has become a common practice everywhere. At EAS, I'm trying to direct people to the best features in my humble opinion, especially when I think they deserve more attention than they usually receive. But EAS is not really made for people with "average interests" anyway. ;-)
Speaking of "most popular", eBay has jumped on the bandwagon and lists the "best matches" at the top of the list, meaning the most frequently requested items or the items of top-rated commercial sellers. This is extremely unfair, because sellers of less popular items or private sellers get even less exposure at eBay than before by being demoted in the search result order. And there are investigations showing that 80% of all users never change the default search order. It is also extremely annoying for me as a "more demanding" user, because every time, when the results are already being displayed, I have to reload the search with a useful order (such as by date or by price).
Deferred off-site links
I don't know if this annoys other people as much as it annoys me, but I absolutely hate it when I click an off-site link and, instead of being taken directly to the destination, I get an intermediate page with a friendly message "You are leaving..." and an advertisement. This habit was already a temporary nuisance around the year 2000, but it has resurfaced with a vengeance in more recent years.
Even some sites with a so far very good reputation, such as Memory Alpha, resort to showing an ad page after an off-site link has been clicked. This clearly has economic reasons and is imposed on the site by the parent company Wikia, rather than by the fans who maintain MA. Anyway, it can only be labeled bad style. It slows down the performance perhaps even more than a pop-up banner would. The companies that buy and sell ads on the internet don't ask whether their practices annoy the hell out of visitors. They just try putting ads in every place where it is technically possible. But it remains to be seen whether the intermediate page acts as a subconscious barrier and makes people stay at MA, or whether it rather damages the site's reputation.
An almost cynical variant of the intermediate page showed up for some time (until about mid-2010) when someone clicked an off-site link from Facebook. It read: "Be careful. For the safety and privacy of your Facebook account, remember to never enter your password unless you're on the real Facebook web site. Also be sure to only download software from sites you trust." Oh yes. The internet is so dangerous, you better stay in the cozy realm that is Facebook. -- Give me a break! Facebook is infamous for its own lack of data security, and they warn people when they try to leave the place?!
Another aspect to be taken into account is that the intermediate pages conceal the mere existence of off-site links, which may make them invisible to search engines. It depends on how exactly it is done, but a page whose outbound links are "protected" with intermediate pages may not contribute its due share of Google PageRank to the link targets, while the page still gains PageRank from any incoming links. If this is true, the consequence is that the pages of MA and Facebook climb up the Google rankings at the expense of the linked pages. I can't verify whether this is really the case, though.
The trAPPed mobile user
As mentioned above, online marketing platforms bug their users to get their apps rather than visit their websites. I have to admit that a well-designed app can be a much more pleasant user experience than a mobile website.
But apps come with various pitfalls compared to mobile websites:
- Many apps leave out functions of the corresponding websites that I deem important. Especially the user management and the forms for posting or searching often lack functions that I am used to.
- No app that I have installed or at least tried has something like tabbed browsing. You are always stuck with one window, which makes such things as comparing items on a shopping app very hard.
- Getting data into and out of apps can be hard too. The clipboard often doesn't work as desired. Sometimes it is not possible to copy or paste text at all.
- With apps, the content or service provider generally has more influence on what I see on the platform than when I use a browser. Also, in the browser I can log out and additionally choose a "private" mode in which the website shows me the content in a more "neutral" fashion (well, not completely neutral, as long as I don't manage to conceal my location and my OS just as well).
Google, the self-declared website police
Since the early 2000s Google has been the all-dominating market leader among the search engines. The visibility of a website in Google's search results is a key factor for the success of a site, which in many cases determines the commercial success of a company. Google has an air of being fair, but the monopolist creates and enforces its own rules that websites have to comply with. The user experience is supposed to improve by ranking sites higher that fulfill Google's standards for content and design. However, irrespective of the true quality of their sites, webmasters have to work a lot to fulfill Google's demands and to keep their sites from vanishing somewhere beyond the fifth result page.
- Google's PageRank is a seemingly "democratic" way to rank sites based on their link popularity. But my impression is that the PageRank is manipulated. Google penalizes sites that link to many other sites (imputing that this is a discouraged SEO advertising scheme), while it seems to promote partners of Google's own advertising services.
- Google demands mobile compliance from all sites and lays down a large number of rules that sites have to fulfill. Although most smartphone users don't have a problem with pinch & zoom to read text on non-optimized pages, Google says that it will weight mobile usability even higher in the future (perhaps even higher than the content?), in which case many older sites will either have to be completely redesigned for thousands of dollars or will die.
- Google has started to enforce Google Web Light on search results, a service that is said to reduce the data volume on slow (mobile) internet connections by passing pages through Google's servers in order to cut down their volume. I have multiple issues with this practice. Firstly, I don't want anyone to parse, filter, rebuild and serve my site. It is my creation, and no one but I myself is to decide how it is supposed to look. Secondly, I checked the look of my site when it is served by Google Web Light. It looks like a pile of shit, the layout is totally scrambled, and to someone who does not understand that it's Google's fault, it looks like I'm a complete idiot who doesn't know basic website coding. Thirdly, for someone who reassembles and serves my site it is easy to sneak in unwanted scripts, such as for user tracking, or even ads. For all these reasons I don't want Google Web Light to touch my site. But the only way to avoid it would be to opt out by sending "Cache-Control: no-transform" in the HTTP response header, in which case my site would be punished for not allowing Google Web Light (or, as Google would put it, for "not being optimized for slow connections").
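For reference, the opt-out is a single HTTP response header ("no-transform" is a standard Cache-Control directive from RFC 7234). A minimal sketch of how a site could send it, using Python's standard http.server as a stand-in for whatever server software actually runs the site:

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

# The header that asks transforming proxies (Google Web Light included)
# to leave the page alone.
OPT_OUT = ("Cache-Control", "no-transform")

class NoTransformHandler(SimpleHTTPRequestHandler):
    """Serve files as usual, but add the opt-out header to every response."""
    def end_headers(self):
        self.send_header(*OPT_OUT)
        super().end_headers()

# To try it locally (hypothetical port):
# HTTPServer(("", 8000), NoTransformHandler).serve_forever()
```

The catch remains, of course: sending this header is exactly what marks a site as "not optimized for slow connections" in Google's eyes.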
Mindless bookmarking and blog posts
I have put up with the features of the Web 2.0, many of which are very casual compared to what I see as the profundity of traditional websites. I am old-fashioned, and I miss the days when website visitors actually recognized what was new and exciting and when they used to read and reflect on what someone was writing. Case in point: In early 2008 the EAS counter went up because of several blog, social networking and online bookmarking posts. The subject: "Starship Interiors", a page that had already existed at EAS for several years at that time. First off, I am glad that people still seem to care about my site, and any attention is welcome, even if an old page is in the focus. But the latter is also part of the problem of the Web 2.0. Once it has entered the blogosphere, old news gets reposted effortlessly, and it becomes something "new" (at least until it has been shifted down to the 14th archive page, where it will be lost until it is old enough to be re-reposted).
And while I think that many of the people who clicked the "Starship Interiors" link did have a closer look at the content, many of those who felt compelled to comment paid no attention to the actual topic. It seems that they took for granted that I created those cross-section views of the ship interiors myself, although the actual source is credited beneath every single thumbnail. Also, several people complained, "So many images, but none from the Enterprise-D?" They didn't even bother going one page level up to look for the TNG page. It appears they don't know or don't care any longer how a website such as EAS works - a real website with a navigation structure, not a blog with randomly selected, half-baked stuff.
A more recent example of the "old news still is news if we desperately need news" trend of the internet is Gene Roddenberry's original pitch for Star Trek that "resurfaced" in February 2011, at least if we believe the blogger who posted the PDF file. The very same file has been continuously available at EAS since 2006. (The pitch was originally posted at trek5.com, from where I rescued it when the site went down. A newer Flash version can be found at Star Trek History.) I don't claim ownership of a file that I simply reassembled, and I don't mind people taking it from EAS for their blog, even if they neglect to give me proper credit. But the old news got reposted in hundreds(!) of blogs and on all the major Trek news sites, not once with a simple check of where it actually came from and whether it was really news. Seeing that something found in an obscure blog gets hyped in the blogosphere as well as in fan circles, while the same content has been on a major Trek site for years (and linked from Memory Alpha and Wikipedia, by the way), I feel let down by the Web 2.0.
People around the world enjoy Star Trek, and most of them are not ardent fans who talk about Trek-related topics every day. But many casual viewers feel a need to clarify that there is a great difference between them, the "cool" people who only watch it, and the "trekkies" who care way too much about it. In their eyes, "trekkies" are losers who run around in silly costumes or autistic people who obsess over totally unimportant things all day. Especially in the wake of the reboot with its mainstream appeal, the trend of nerd-bashing has become worse. Even within the Trek community there is now a certain flavor of fans who have sided with the "cool kids", with the ones who permit no doubt about the new Abrams style being superior to the old nerdy Trek and who accuse nitpickers of ruining the fun of watching Star Trek.
A symptomatic situation on a movie or science fiction message board, in a bookmarking community or a social network is that someone mentions that starship scales in a Trek movie are inconsistent. Several other members, who are often surprisingly (and suspiciously) knowledgeable for self-confessed non-nerds, give their two cents on the particular scene or on scaling issues in general. So far the geeky stuff is still considered cool enough to be discussed. But then someone posts a link to EAS where the very problem is analyzed in detail.
I really don't expect people to be grateful for my work. Actually, I don't mind if they find misinterpretations or errors in my research and criticize me for it, because EAS thrives on continuous improvement. And even if they ignore the link to my site and carry on regardless, it is okay with me. But it happens frequently that someone starts a smear campaign against me and several others chime in. A campaign in which the posters spread lies and misinformation about my site and my person, sometimes coming from people who claim to know me. A campaign in which the posters insinuate I suffer from mental disorders and in which "measly basement dweller" or "sperg" (which I learned is short for "Asperger" in antisocial circles) are still among the rather harmless comments. In some communities it seems to be common practice that, after a few pages of decent discussion, the participants feel an urge to bully fans who are arguably even nerdier.
People such as those forumites are clearly anxious only to dissociate themselves from "nerds" and particularly from the worst category of nerds, the "trekkies". And while they put down the diligent and dedicated work by true fans like me, they themselves create nothing but hot air. I have no problem at all with people hanging around in discussion forums or social networks just for fun, but only as long as they don't do it at the expense of other people. And while non-fans who ridicule "trekkies" are bad enough, there couldn't possibly be anything more hypocritical than Star Trek fans disparaging other fans. Irrespective of what these people are like in real life, I find it very pitiful that they need to practice cyberbullying to feel good.
Overzealous spam watch or netiquette in forums
As someone who runs or moderates several online communities, conventional bulletin boards as well as social network pages, I am well aware of the existence of trolls, spammers and bullies. I never had many problems with such people though, and especially social media are a friendly place where little moderation is necessary, if any. In this light it is paradoxical and very unsettling that a number of online communities have labeled *me* as a spammer or troll. Here are three cases:
- I used to frequent big Star Trek news blogs/forums with usually controversial discussions, and for all I know I was still among the rather moderate commenters. But for some reason some of my comments were never admitted or were lost without a trace after posting. Sure, there may be technical reasons for this, or my posts may have been deleted accidentally. But as much more inflammatory comments by others are still posted, I really wonder whether someone wanted to get rid of a critical voice (see "Trekkie-bashing" above).
- Disqus is a commenting system that I added to EAS for users to give their two cents on my articles. There are not many comments (less than one per day on average), but the comments have always been civil so far, with no need for me to moderate anything. That's the situation on EAS. Whenever I myself commented somewhere else, my own comments repeatedly got flagged as offensive or got tagged as spam. This happened on other sites using Disqus for commenting, as well as in the Disqus support forum. And in the latter case there is no reason to assume that someone wants to punish me for my unpopular takes on the Abramsverse or for my criticism of "Shatner in the next movie" speculation. Among hundreds of recent posts in the support forum it was my two requests that were marked as spam! I have no idea what I did wrong because the wording of the comments was not anything like spam, and wasn't in any other way offensive. In addition, my complaint about the unjustified stigmatization was deleted. There are apparently more than just one or two people (or bots) who hate me. For more than just one reason. But for reasons that I'm totally unaware of.
- While no one explicitly called me a spammer in this case, there is a German message board that I visit occasionally where my style of posting was repeatedly called "impolite". You should know that this message board (that has nothing to do with Star Trek, by the way) has a special netiquette. You have to begin the post with something like "Dear forum members" and close it with "Best regards, REAL NAME". My "mistake" was that I forgot to include these two lines and that I did not want to reveal my real name (which is *not* a requirement but just a "recommendation" by the moderators). The result was that most of the responses did not address the topic I wanted to discuss but were friendly reminders such as "I would give you feedback if you were more friendly". Agreed, this is an extreme case of a very special message board, but perhaps it is part of my problem.
To summarize, I am sure that I am very kind to other people who have requests, as I can tell from the overwhelmingly positive reactions at EAS, in the social media and especially in follow-up e-mails from people with whom I discuss Star Trek stuff. There are two possible explanations for my frequent failure when requesting something from other people, in communities that have nothing to do with EAS:
- I am actually a rude person and not aware that I'm offending other people.
- Few other people are as kind and patient as I am myself when it comes to tending to requests or to simply listening.
Wikipedia and the arrogance of power
I used to like Wikipedia very much. It was and still is one of the few sites that I visit almost every day, a site where I can spend hours and always discover something new. I consult Wikipedia very often for my research in various fields. I learned a lot about subjects that I barely knew anything about before Wikipedia. I still appreciate very much that some people dedicate much of their spare time to extending the site and to keeping it accurate and up-to-date.
But many of the people in charge at Wikipedia abuse their power. They thwart necessary corrections or additions to the content. They treat contributors, often experts in their fields, with disrespect, hiding behind narrow-minded policies. In several fields where I am an expert (not only Star Trek) the Wikipedia articles are inaccurate, inconsequential and self-important, and I can do nothing about it because the bureaucrats discourage edits and don't even listen.
It is not just a problem of the Star Trek fandom, but I will pick it as an obvious example. It is dissatisfying that well-written articles about major Trek websites such as TrekBBS were deleted from Wikipedia, and even Memory Alpha has been tagged for deletion for some time. All because of a lack of "notability" under the debatable terms of Wikipedia, which might permit the inclusion of a local school magazine and perhaps a garage business but not of a renowned fan website with tens of thousands of worldwide users every day. On the other hand, the Star Fleet Universe, a gaming subculture that very few fans care for, was conceded a whole article series (which has been cut down in the meantime though). I'm not even complaining that an article about my own website, Ex Astris Scientia, was first rewritten into one about the Starfleet Academy motto of the same name and then merged with the one about Starfleet Academy. The only sentence left about EAS was totally off-topic and was eventually removed too. EAS may not be "notable", but the deletion policy should remain consistent, which is not the case as long as other unofficial fan works still have their place at Wikipedia, not to mention many other fan sites outside Star Trek.
Something that offends me personally is that dozens of links from Wikipedia articles to EAS were deleted although they fit the context perfectly - the best off-site references Wikipedia could possibly get, with much more detailed information on the very topic of the Wikipedia page. The cited reasons in these cases: "A personal website is not notable/reputable", "original research" or even "link spam". In fact, Wikipedia withholds important information and denies its users the opportunity to obtain a second opinion by not linking to EAS, the definitive #1 website for canon starships, technology and continuity issues. On the other hand, Wikipedia absolutely loves to cite the ramblings about Star Trek by "reputable" critics or newspaper columnists, although these usually have neither special ties to nor sufficient knowledge of the franchise or the fandom to produce anything "notable".
Wikipedia gets the basic facts about Star Trek right, but does everything to keep the franchise and even more so the fandom small. Overall it is not a good Trek resource. On many pages Wikipedia gives mentions in novels or games the same weight as canon facts, or simply omits the necessary distinction. The list of Starfleet ship classes, for instance, is a pile of crap (as of July 2015). Pages about the characters list non-canon middle names from obscure and highly questionable sources. Some episode descriptions have note fields calling for citations in "reliable third-party publications", as if some newspaper critic were required to confirm canon dialogue! Well, Wikipedia doesn't want to be an encyclopedia of in-universe (canon or non-canon) facts of Star Trek and gives more weight to the real-world implications. Still, this doesn't justify the countless mix-ups of Star Trek facts that can be found at Wikipedia.
More generally, Wikipedia increasingly suffers from the following severe deficiencies:
- Wikipedia has an odd 20th-century idea of "notability" and of the "reliability" of sources. Other websites, blogs or forums have no place in Wikipedia's view of the world. In Wikipedia's terminology, unlike newspapers with "professional journalists", they are not considered "reliable sources". The ironic (and maddening) thing is that "invalid" sources (such as EAS) are often where the bulk of the information in Wikipedia articles is taken from, for which they receive no credit!
- Wikipedia prefers renarration ("reliable third-party sources") over genuine competence. It deters scholars and experts from contributing, not only because of the learning curve before their writing style is polished enough for an edit not to be undone immediately (see below), but most of all because their knowledge is not asked for in the first place.
- Wikipedia deems adherence to formal criteria ("Manual of Style") more important than the correctness or the completeness of the facts. While this may not be chiefly the fault of the platform and of its principles, too many users indulge in hair-splitting. They often criticize or even undo useful edits with small stylistic weaknesses instead of fixing them.
- Wikipedia prefers incorrect or outdated information over "insufficiently cited" corrections. Overzealous users usually revert any updated or corrected facts if these come without "reliable" citations, even in cases where the old information was obviously wrong or didn't have any cited sources either. They usually don't perform a simple plausibility check but simply undo the edit.
- Wikipedia is ruled by a small number of full-time users and their bots. They appear to lie in ambush for edits by normal users that they feel compelled to revert, with the rationales mentioned above and often within seconds after the edits were submitted. They engage in edit wars over trivialities on recently changed pages. Other parts of Wikipedia, irrespective of the topic, remain unattended for years, even if the articles are obviously poorly written or an edit was requested. If admins gave all pages the same treatment as the ones they discover as a prime target, especially regarding "notability" and "reliability of sources", I am certain they would have to remove 98% of Wikipedia's content.
- Wikipedia admins and other privileged users treat newbies with disrespect, which is extremely discouraging, especially for people who would have more to contribute than citations or stylistic corrections in existing articles. It has become a club of full-time bureaucrats who love to cite from thousands of rules and who leave no leeway for people who have to work for their living and who are on Wikipedia because of their interest in certain subjects or just for fun.
- Wikipedia is self-contained and self-centered. External links are effectively banned from Wikipedia articles (because almost no website is deemed "notable"), contrary to good practice everywhere else on the internet. In other words, Wikipedia does not want readers to leave the site, or to obtain a second opinion elsewhere.
- Wikipedia imposes lots of formal rules on its members but no responsibility for the correctness, completeness or coherence of articles. In fact, the edits done by overzealous bureaucrats in some articles qualify as vandalism. They remove huge chunks of useful information or revert corrections, and they very often post signs saying that references are missing, that the article sounds like advertising or that the content is disputable. While in most cases this assessment may be correct according to Wikipedia's own rules, it damages the reputation of the people, organizations or companies that the articles are about.
- Wikipedia articles are subject to favoritism. It is only natural that articles about important topics are longer and are edited more frequently than others. But in addition to this, there are certain popular people, companies and other topics whose articles enjoy a preferred treatment regarding the acceptance of updates and corrections and a lower threshold regarding the quality of new sources. There are often far more proponents than opponents of an update or expansion. The opposite applies to less popular articles as already explained above. Even if an expansion of a "stub" was explicitly requested, very often all editing attempts get reverted. Wikipedia is *not* neutral but favors those topics that are considered particularly important by the community.
- Wikipedia says it is "the free encyclopedia that anyone can edit." The latter part is untrue. Just like "This article is a stub. You can help Wikipedia by expanding it." I made perhaps 40 to 50 edits to articles (not only Star Trek) in my Wikipedia life, almost all of which were reverted after a short time, for one or more of the above reasons. There's no reason why I should still bother, and Wikipedia itself should finally acknowledge that it does not allow people to edit.
For all the reasons mentioned above, Wikipedia is definitely the wrong place to look for reliable Star Trek facts. That honor belongs to Memory Alpha, although especially in the wake of the latest movie it too has developed a bit of a "better than thou" attitude towards "biased" fans and their "insufficient research". Overall, the way that Wikipedia and, to a lesser extent, Memory Alpha deal with the fandom is another aspect of the general trend that the operation of personal websites and blogs is increasingly obstructed and their mere existence is ignored. It is obvious that no profit can be made in any form of alliance with sites such as EAS, which makes them irrelevant in today's business-driven internet. But I can't help the impression that this stance rubs off on an increasing number of private internet users and on Wikipedia as a non-commercial platform, who may have come to think that a place that doesn't sell anything and has no ads can be no good. Fan sites like EAS that are not organized in a seemingly "democratic" fashion but are maintained by single individuals are additionally under suspicion of being inappropriately biased and intrinsically incorrect, as if online collaboration could eliminate personal opinions and flawed concepts a priori.
There is another, albeit minor, source of sorrow pertaining to Wikipedia. Minor, however, only because EAS is non-commercial and I don't need people who click my link for a living. Wikipedia has enforced the automatic addition of the rel="nofollow" attribute to any outbound link, which is a questionable SEO technique with the same effect as the other techniques of deferred linking mentioned above. That way Wikipedia does not leak Google PageRank, while it gains PageRank from any incoming links (such as from EAS) that are usually not protected. The consequence is that Wikipedia pages climb up the list at Google. Indeed, many Wikipedia articles have much higher PageRanks than the index pages of whole major websites dedicated to the very same topic and with the same keyword in the title. Those Wikipedia pages appear first in Google's search results, although they are just sub-pages. There are many factors that contribute to Google rankings, but Wikipedia's link policy certainly helps it stay on top. Maybe I should even be glad that there is no article on EAS at Wikipedia any longer...
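To illustrate what this means in markup (the URLs here are just examples), this is the difference between an outbound link as Wikipedia emits it and an ordinary link as most other sites publish it:

```html
<!-- Outbound link as Wikipedia emits it: search engines are told
     not to pass PageRank on to the target. -->
<a rel="nofollow" href="https://www.example.org/">Example site</a>

<!-- An ordinary link, which does pass PageRank to the target: -->
<a href="https://www.example.org/">Example site</a>
```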
I am not a big fan of the so-called Web 2.0, because I was quite content with the conventional internet as a platform for companies, institutions and individuals to present their products, services, research or opinions. I certainly appreciate the new possibilities of online shopping and banking, of online collaboration and of photo/video sharing. There are a couple of blogs I like, although it is my firm opinion that hand-made sites are superior. But I never asked for something like social networking, link sharing and microblogging, because it appears to me as a poor surrogate for traditional websites and discussion forums and ultimately as a poor imitation of life itself. I don't want to boast a list of online "friends" or "followers", even though there is a chance that a few of them may become true friends. I don't want to share funny pictures and videos with my social network "friends", because I found it embarrassing already in the 1990s when e-mail was the only way to do it.
I am willing to give the "social web" a chance though. I have been creating RSS feeds with EAS site updates on a regular basis since 2007. As I am running a hand-made non-CMS website, I have to write the source code of the feed in a text editor. While the feeds of blogs, forums and other Web 2.0 sites are created automatically, no one has ever bothered to provide software or a script that would enable people like me to convert HTML to feeds or back on the fly. Still, every website with a certain reputation is expected to have a feed. I make the effort of creating a feed because I know that some regular visitors like to have all news sources in a feed reader, although RSS feeds are on the decline (see below).
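To illustrate the kind of conversion that otherwise has to be done by hand, here is a minimal sketch in Python, using only the standard library (my own illustration, not the missing tool itself; all titles and URLs are placeholders), that turns a list of site updates into an RSS 2.0 document:

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone
from email.utils import format_datetime  # RFC 822 dates, as RSS 2.0 requires

def build_rss(title, link, items):
    """Build a minimal RSS 2.0 document from (title, url, date) tuples."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    ET.SubElement(channel, "link").text = link
    ET.SubElement(channel, "description").text = "Site updates"
    for item_title, item_url, date in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = item_title
        ET.SubElement(item, "link").text = item_url
        # guid and pubDate are the fields that third-party republishing
        # tools typically insist on.
        ET.SubElement(item, "guid").text = item_url
        ET.SubElement(item, "pubDate").text = format_datetime(date)
    return ET.tostring(rss, encoding="unicode")

# Hypothetical update list; the URL is a placeholder.
feed = build_rss(
    "Example Site Updates",
    "https://www.example.org/",
    [("New article", "https://www.example.org/new.htm",
      datetime(2015, 7, 1, tzinfo=timezone.utc))],
)
```

A real script would additionally escape or strip the HTML of the update texts and write the result to the feed file on the server.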
In 2009 I signed up with Twitter because some other notable Trek sites were on Twitter too. I didn't expect much, but what I found there was arguably the most underwhelming internet innovation of all time.
The most evident drawback of Twitter is the posting limit of 140 characters. What can I say with just 140 characters? Sorry, that is not sufficient for me. Not by a long shot. I understand that the posting limit allows a tweet to be sent as a text message from and to a mobile phone. But it is the 21st century; we have smartphones, and the mobile internet is becoming affordable, with the possibility to switch to e-mail instead of SMS. And while I see no reason to use SMS for my tweets, I don't see the need to receive a text message from Twitter either. Nothing broadcast via Twitter could possibly be so important that I would want to be informed via text message on my way home. Home to my ultrafast DSL flatrate, where I have the whole world at my fingertips - a world that I don't need and don't want to have dumbed down to 140-character posts. Overall, the social networks have totally lost perspective on the amount of information that is useful for a purpose and in a given medium. On one occasion they bombard me with dozens of Flash videos on a single page without asking; another time the "news" that people urge me to read at Twitter consists of a few inane words.
To most people it may be a minor nuisance, but the necessity to shorten links in a tweet is another thing that puts me off. For an internet-literate user like me, it is important to see in advance where a link will lead: to a news site, to a personal website, to yet another social network? Or to shameless spam, to porn, to Nazi propaganda, to malware? Provided that the redirect works at all (which is not always the case), clicking the obfuscated link can become a very unpleasant surprise.
The thing about Twitter that puts me off most is the total absence of a logical structure and the missing coherence of the tweets. Twitter may be the easiest way to spread the word about serious issues such as protests against dictators, but even this usually vanishes in the white noise of the mindless omnidirectional chatter. On conventional message boards there are topics, and off-topic posts are commonly scorned for very good reasons. On Twitter, everything is off-topic by default. I can still try to trace who replied to what, but it requires me to follow link after link, and trying to find the original tweet is a bottomless pit in most cases. It feels like I'm thrown back into the internet stone age. So I often can't tell what two or more people are talking about (and perhaps they don't know themselves?). I was following the tweets of two Trek actors for some time, but I could not make any sense of most of what they were writing, which matters they were replying to, or which users they would reply to at all. Twitter is totally chaotic chatter that defies any attempt to order your own trains of thought and your conversations, much less to create something of substance or to really get to know other people.
Following the example of other big Trek websites, I tried to post at least my site updates on Twitter. I expected that this could be done automatically, because I already had the hand-made RSS feed as some sort of admission ticket to the Web 2.0. But in order to get third-party tools such as Twitterfeed, dlvr.it or Feedburner to work, I need to include additional fields (compliant with RSS 2.0), which means even more hand-editing for me. As one of the very last dinosaurs still running a static website, I will reach a point in the not-so-far future when I have to concede that I can't keep up the visibility of the site in the world of social networking any longer.
I'm sorry if my rather low opinion of Twitter offends the people who are obviously having fun there. Twitter may be a good platform for casual discussions that wouldn't be possible elsewhere. It may be great for actors and other celebrities, or for people who just imagine they are famous, to gather followers who hang on their every word and click every ad they post. But its basic principle of 140 characters per post is in strong contrast to my idea of electronic communication, and since I have to make considerable efforts just to re-re-post my own site updates, it is rather a duty than a pleasure for me to be on Twitter.
Visit us on Facebook
My rant about social media wouldn't be complete without mentioning Facebook. Yes, I'm on Facebook too. But just like with Twitter I'm not there in the first place because it is so much fun to repost my site updates there. It is primarily because people expect EAS, as a high-profile site that caters to the Trek community, to be on Facebook. To put it drastically, if you're not on Facebook, you don't exist.
I personally see Facebook as a sometimes more and sometimes less useful supplement to the main site. Not as a place for me to hang around. It is something between a better guestbook and a platform for announcements, discussions and fun posts that are too small for an EAS article and too big for the index page of EAS. And while it means additional work it sometimes really pays to have the support and the feedback from the great people that are on Facebook.
However, unlike EAS, an ever increasing number of celebrities, other individuals, big companies and small businesses alike, as well as TV stations, are overdoing their Facebook activities. They urge, almost coerce, their fans, customers or viewers to visit their Facebook pages rather than their genuine websites or blogs. "Visit us on Facebook" - this phrase is all-pervading on TV, in print media and, of course, on the internet itself. Many company websites boast Facebook logos of the same size as their own logos. It sounds as if their Facebook page were the only thing that mattered, the only way to get in touch with a company, or at least the only officially endorsed way. Many companies offer feedback and other services exclusively to Facebook members. Some have even stopped updating their website or have abandoned it altogether in favor of Facebook. Websites that often used to be well-maintained, with a wealth of well-ordered content and with a distinct design. In other words, with everything that Facebook doesn't offer.
Well, posting at Facebook is a lot more convenient and less time-consuming than maintaining a fully-fledged website or only a blog. You don't have to care about the design and the structure, much less about the coding and server maintenance. However, it is a shame how individuality is increasingly abandoned in the age of social media, and how it makes way for a conformist world where everything looks and works alike and everyone has to become a member to participate.
While there are a few options to enhance Facebook pages by adding apps (that include content from other social media, for instance), Facebook doesn't offer any kind of directory of apps that it approves of. The so-called "App Center" of Facebook lists only the following types of apps: games, games, games, shopping, games, and more games. I actually have to go to Google to find useful apps for my Facebook page! (But speaking of Google, Google+ seems to allow no apps at all that would make it more versatile or more accessible.)
I personally visit Facebook only to check my EAS fan page. And lately I take care to log out after each visit, because I neither want Facebook to know where else I am going, nor do I have a desire to see "Like it" buttons and other plug-ins everywhere that urge me to vote for or comment on something and to make that visible for everyone in the world, increasingly with annoying pop-up windows. And since I respect the privacy of my visitors just as much, I promise that I will never install any Facebook (or Google+ or Twitter) plug-ins at EAS.
APIs and the demise of RSS
Posting content across multiple platforms is a big topic in the Web 2.0. So far RSS (Rich Site Summary or Really Simple Syndication) has been very helpful in sharing the content of a website or blog, but also of Facebook, Twitter, YouTube and other social media. Importing an RSS feed into a reader is very easy, and although it is not possible to parse a feed without additional tools such as a server-side PHP script or the online service Feedburner, it is also the method of choice for bringing news from around the Web 2.0 to websites such as EAS. The Trek Feeds page at EAS is composed of RSS feeds from several other websites.
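As a sketch of what such server-side parsing involves (in Python rather than PHP, with a made-up feed; a real script would also fetch the XML over HTTP, cache it and handle errors), extracting headlines from an RSS 2.0 feed boils down to this:

```python
import xml.etree.ElementTree as ET

def parse_feed(xml_text, limit=5):
    """Extract up to `limit` (title, link) pairs from an RSS 2.0 document."""
    root = ET.fromstring(xml_text)
    entries = []
    for item in root.findall("channel/item")[:limit]:
        entries.append((item.findtext("title", default="(untitled)"),
                        item.findtext("link", default="")))
    return entries

def render_html(entries):
    """Render the entries as a simple HTML list for embedding in a page."""
    items = "".join(f'<li><a href="{link}">{title}</a></li>'
                    for title, link in entries)
    return f"<ul>{items}</ul>"

# Hypothetical feed with a placeholder URL:
sample = """<rss version="2.0"><channel><title>Example Trek News</title>
<item><title>New article</title><link>https://www.example.org/a.htm</link></item>
</channel></rss>"""
headlines = parse_feed(sample)
```

The output of `render_html(headlines)` is what would end up on a page like Trek Feeds.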
But RSS (or the alternative format Atom) is on a decline:
- Twitter terminated the RSS feed for the user timelines in 2013.
- The built-in feed for a user's or a page's Facebook wall was extremely poorly maintained for as long as I used it. For over a year, the shared links in the feed looked like this: "l.php?u=http://truelink.com". In other words, a relative link to a script on Facebook's server that redirects to the true URL. This relative link would only work from Facebook, but not from other places (which is what RSS was meant for, after all)! The only way to make Facebook's RSS feed work was to program the parser to extract the true link "http://truelink.com" from the mess. In July 2014, the links were "fixed" to absolute links, but it still wasn't possible to click them when posted anywhere else, because they still pointed to the Facebook server, where they produced a warning before the true link could be accessed! But even worse, with many tools, including the script on my server, it wasn't possible to even access the reworked feed, because it used HTTP Secure. Facebook evidently disregarded essential rules that were nailed down for feeds for a good reason: to make them accessible for anyone, from any place and with any device. Well, all that doesn't matter any longer, because since 2015 Facebook's feed is gone too.
- Feedburner, a major tool for webmasters and content providers to process feeds, has not been maintained in a long time. It will likely stop working soon, considering that it is owned by Google, and Google concentrates all of its social media development on Google+.
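The link-unwrapping workaround described above for Facebook's old feed amounts to extracting the `u` query parameter from the redirect wrapper. A minimal Python sketch of that fix (the wrapper format is as quoted above; the function name is mine):

```python
from urllib.parse import urlparse, parse_qs

def unwrap_facebook_link(href: str) -> str:
    """Return the real URL from an 'l.php?u=<target>' redirect wrapper,
    or the input unchanged if it is not such a wrapper."""
    parsed = urlparse(href)
    if parsed.path.endswith("l.php"):
        target = parse_qs(parsed.query).get("u")
        if target:
            return target[0]  # first (and normally only) 'u' value
    return href

print(unwrap_facebook_link("l.php?u=http://truelink.com"))  # http://truelink.com
```

That a feed consumer had to resort to this kind of URL surgery at all illustrates how far the feed had drifted from its purpose.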
RSS is dead and buried as far as the big social networks are concerned, for obvious reasons: it's not real-time, it can't be tracked, and it can't be customized with ads. It is only a matter of time until other sites and blogging platforms follow and share information only via social network APIs and no longer with each other. Former publishers of feeds are forced to work with the various APIs (Application Programming Interfaces) of the social platforms if they still want to include off-site content in some fashion. APIs, however, are a trillion times more complex than feeds and can only be handled by experienced programmers. Even worse, APIs are not a standard like RSS. They are frequently "upgraded" by the providers and hence require constant maintenance on the client side as well. Whereas big companies have whole departments of programmers to upgrade their own APIs and to work with the APIs of other platforms, webmasters who are doing their own thing have no chance to keep up. At some point in the future, sites like EAS will be excluded from social media content.
Well, I could still use the ready-made badges or widgets from Twitter, Facebook and Google+ to incorporate their content at EAS. But I will never do that, because I refuse to post code that I can't control and that sniffs out my visitors' browsing behavior. Moreover, the widgets are extremely ugly, and it is impossible to customize them so that they blend in.
I already spend several hours per month just testing and implementing the limited social media integration at EAS and maintaining the various services involved. At some point I may have to stop posting any off-site content.
All your header are belong to us!
As already mentioned in the paragraph above, it is becoming ever more difficult to embed social media content on a web page, because the social networks are replacing simple RSS with outrageously complex and ever-changing APIs. The same can be said about the opposite direction: the sharing of pages of conventional websites on social media, and search engine optimization. The number of obstacles and pitfalls for webmasters who want their content to be found and shared is growing rapidly, not to mention the extra work necessary to make a web page ready for different social media and search engines.
When someone shares a link on Facebook, the API tries to grab up to three images from that page to select from to illustrate the post, or so it says. Actually, the API skips images smaller than a minimum size that is probably around 300 pixels in width; likewise it ignores links to bigger images. For example, whenever someone posts a link to one of the many EAS gallery pages that are full of images accessible via thumbnails, Facebook will not offer any of them for posting, neither the thumbnails nor the linked large images. Google+ still allows small thumbnails as of 2016. Facebook, in contrast, usually selects the EAS header image, which is obviously large enough not to slip the net with its width of 400 pixels, and the posts look accordingly nondescript and ugly. I am opposed to posting full-size images that would add up to several megabytes on a gallery page, and the current thumbnail size of 267x200 pixels is already the upper limit of what is technically possible (regarding the page layout and the data volume). What is so hard about crawling image links anyway, especially since Facebook currently restricts the selection to only three? Effectively, Facebook excludes many websites from showing rich content in shared links, although they do have this rich content.
Well, Facebook offers a possibility to define an image to be posted along with a shared link. But for that I would have to include an individual tag <meta property="og:image" content="http://www.ex-astris-scientia.org/dir/image_dir/individual_big_image.jpg" /> in the header of every single page! Whereas some people keep telling me to get a CMS to solve all my problems with accessibility and content sharing, a CMS wouldn't reduce that extra work one bit (among the many other problems that a CMS simply can't solve).
The og:image meta tag is part of the Open Graph protocol that is said to simplify the inclusion of external content on Facebook. This may sound like a good idea because I *can* tell Facebook exactly how data from a web page should be handled. But in reality website owners *are forced* to tell Facebook how each of their pages should be handled in order to keep up the visibility of the posts and to prevent Facebook from misrepresenting the content. The quasi-obligatory og:image meta tag is just a first step to "educate" webmasters and CMS operators to include all kinds of repetitive or redundant information in their page headers, causing a huge amount of extra work - a special service for big companies that have whole departments for SEO and SMO (social media optimization) and a slap in the face for smaller businesses, not to mention people with personal websites.
Everything described above is just for Facebook. Open Graph is not a standard. While Google+ uses Open Graph too, the more important Twitter has its own proprietary meta tag system, Twitter Cards, which has to be included on each single page in addition to Facebook's Open Graph tags. Twitter has even gone one step further and significantly increases the visibility of tweets once it happily detects Twitter Card tags in the linked resources. It is obvious that a "Card" - a link posted with an image, a description of the linked page and other extra information - will get a lot more attention than a simple 140-character tweet with a cryptic link. And this is only possible after a request to be whitelisted for Twitter Cards - a permission that may not be granted to, or may be withdrawn from, "substandard" sites, which is what EAS is in the world of social media.
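To make the duplication concrete, here is what the header of a single page ends up looking like once both systems are served. The tag names are the real Open Graph and Twitter Cards properties; the titles and URLs are placeholders, not EAS values:

```html
<!-- The same page described twice: once for Facebook/Google+ (Open Graph),
     once for Twitter (Twitter Cards). All content values are hypothetical. -->
<meta property="og:title" content="Page title" />
<meta property="og:description" content="Short summary of the page." />
<meta property="og:image" content="http://www.example.org/big_image.jpg" />
<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:title" content="Page title" />
<meta name="twitter:image" content="http://www.example.org/big_image.jpg" />
```

Six lines of near-identical metadata, multiplied by every page of a site, maintained by hand unless a CMS or build script generates them.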
And as if all this were not yet enough extra work, Google (the search engine) too demands a shitload of special meta information (AMP markup, which is not compatible with Open Graph either) from a "quality page" - in other words, from a page that gets displayed more prominently in search results irrespective of its true quality (i.e. the quality or relevancy of its content).
Screen resolutions and monitor dimensions keep growing. The last time I worked on a 640*480 pixel monitor was in 1996. But many websites, particularly those of some big companies, have a fixed width of 800 or even 640 pixels and use the smallest possible font size that is barely readable. Moreover, on many news sites that squeeze the text into a narrow column between the navigation bar and the ads, I get only a small portion of an article at a time and have to press "next" several times to carry on reading. While I can increase the font size for better accessibility (which inevitably ruins the layout, especially if the text columns are narrow), there is no real way to tell the website to occupy a larger portion of my big screen (which is big for a reason, after all). Some sites should really list a magnifying glass among their viewing requirements.
I think that, rather than paying attention to antiquated monitors, web designers simply don't know any better than to nail down absolute dimensions for everything, especially since everything can be formatted so easily with CSS. I am fond of CSS as a great way to customize everything, but it encourages and sometimes forces web designers to abandon the art of adaptive layouts.
Update The font sizes are growing again, and there is an increasing trend to occupy the whole screen instead of a narrow column. I think the new trend was ironically fostered by the need to support mobile devices with their lower resolutions. Since the programmers needed to create a special CSS for the smaller screen sizes, they had to think of new concepts of adapting the page widths, which work just as well in the opposite direction and now allow full-screen layouts. Actually, the current trend for images as well as navigation elements is "the bigger, the better".
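The adaptation mechanism described above is the CSS media query: once a designer writes one stylesheet variant for small screens, the same technique makes the layout fluid on large screens too. A minimal sketch (the class name is illustrative):

```css
/* Fluid by default: the content column scales with the viewport,
   capped so lines don't become unreadably long on huge monitors. */
.content {
  width: 90%;
  max-width: 1600px;
  margin: 0 auto;
}

/* On narrow (mobile) screens, drop the side margins entirely. */
@media (max-width: 600px) {
  .content {
    width: 100%;
  }
}
```

Nothing here requires a fixed 640- or 800-pixel column; the rigid layouts were a design habit, not a technical necessity.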
Why is it necessary to have images or even sub-pages open in new browser windows (with or without a redundant "close this window" button, and too often without a menu bar and location bar)? I talked to a few site designers several years ago, and I was told more than once that the average user would likely always stay in the same window and always click on links, never on the "back" button. What crap! If the users were really that dull, there would be nothing more confusing to them than a maze of windows, especially if it's not clear how they are being controlled (like when you click on another image link on the page of origin and don't know whether the image opens another pop-up or refreshes the current one).
I concede that streaming may be an acceptable way to show small (and accordingly fast-loading) clips embedded in websites, if only there were reliable software to play them. But most streams force me to have Flash Player installed. I don't know if I'm the only one, but messages like these have been haunting me for over ten years: "Get Flash Player now" (when I have Flash Player but it was not installed for all browsers for some reason), "Please upgrade to the latest version" (when I already have the latest version) or "You need Flash Player 9 to play this video" (when I have version 10 installed).
The whole concept of Flash is flawed from the start, because it is overblown at the expense of usability. Well, Windows Media Player is overblown too, but at least there are lightweight alternatives to it, which don't exist for the Flash Player plug-in. As already mentioned, if there were a lean video player for streams (like conventional TV, which fundamentally needs no player software at all), the idea of online broadcasts wouldn't be all that bad. But Flash is much more than a simple player, and it does much more than is apparent to a normal user, such as circumventing the security and privacy settings of my browser or its plug-ins. Still, it doesn't offer any useful configuration options. And the use of Flash on websites is not restricted to playing videos or presenting truly interactive content (for which it can be beneficial in spite of everything). Many web designers just can't resist creating complete navigation structures in Flash, or they pack text into Flash images, which makes it totally inaccessible. And don't get me started on the dreadful Flash ads! Speaking of reduced usability, it is extremely frustrating how Flash disables the right-button menu and CTRL-LMB to open a new tab, throwing me back into the browser stone age (and forcing me to reload the same browser window all over again when going back and forth - speaking of wasted bandwidth).
But worst of all, nasty Flash applications frequently usurp 100% of my CPU power and make my PC or laptop hang, so that the only remedy is to force-quit Firefox. This may not be Adobe's fault in the first place, but of all the applications that run on my computers, either stand-alone or as browser plug-ins, it is usually Flash that makes the system freeze, so there must be something genuinely wrong with it.
Update 1 I am used to trouble with Flash Player installations, but version 10.1 beats everything. When I go to the Adobe website as demanded, I first have to get the Adobe Download Manager (DLM) and, if I don't notice the little checkbox, McAfee Security Scan too. Great - yet more parasitic programs sitting on my system. When I proceed with DLM and want to install Flash Player, I have to close Firefox, but then nothing happens. After downloading the whole package three or four times in vain, I went to a website with a different prompt to install Flash directly through Firefox, and this time I was allowed to get Flash without the crappy DLM, and without even closing Firefox. Adobe once again remains true to its unofficial motto: "Why make it simple when you can have it complicated?"
Update 2 I still don't like Flash, but for the past few years I have experienced no major problems with it, except for occasional crashes.
I frankly concede that the performance of Google (and likewise of other search engines) has improved a lot in the past few years despite the ever-rising data volume to be handled by their servers. It has become ever harder to cheat and to get fraudulent content listed. However, there are still too many fake sites that slip the net and receive average to good rankings from Google. The typical fake is an automatically generated so-called scraper site with a short and distinct static URL (insinuating rich content) and a common Google search term as its title. But on closer inspection, the "content" consists of nothing but a list of old search results, random news and keyword-driven ads that may or may not be remotely related to the page title. Something like this is utterly useless and frustrating every time a user is looking for real information. I stumbled across pages where I could "Learn everything about Ex-Astris-Scientia.org" or even "Find the Lowest Prices for Ex-Astris-Scientia.org", as if my site were for sale!
Google usually deletes auto-generated pages from the index when they are discovered or reported. Many of them are part of a link farm that has been set up to rank another page higher and are therefore banned from Google. I appreciate that. On the other hand, until 2012 Google even offered to build such sites with its AdSense for Domains program! In other words, Google helped to cheat unsuspecting visitors and to undermine its own index. I appreciate that they finally reconsidered their involvement in this immoral business. It is a pathetic pretext that "parked domains" will eventually be filled with content. If a parked domain is really meant to serve a certain company, the least they must do is announce the upcoming launch of the website instead of filling it with pointless ads that will deter visitors forever. For the same reason, a halfway respectable company would never register a domain that has been contaminated with a spam portal in its recent history and may even be much more expensive than a fresh one. Ad-polluted parked domains (many of which have been taken away from defaulting webmasters, their reputation ruined to embarrass them) are dead domains, and it is good that Google doesn't actively support this business any longer.
Another annoyance is ripped sites, especially those duplicating Wikipedia. While it may be a good idea to have more than one universal encyclopedia, just to read a second opinion or see a different approach, there are many wiki-based encyclopedias that copy a portion of Wikipedia verbatim and are never updated. I can well understand if something like news is exchanged among different sites, but encyclopedic content should be off limits. I am not sure about the legal ramifications (Wikipedia may allow its articles to be copied). In any case it is a total waste.
Update It is my impression that the number of scraper sites has decreased, or that Google has taken better measures to exclude them from the search index. In any case, it is not really an annoyance any more.
EAS on the Downturn - my thoughts on why EAS has lost half of its visitors and how to preserve what's left
Where Have All the Trek Sites Gone? - essay about the shrinking number of Trek websites