Archive for Web Design

19 percent of the web now runs on WordPress

SAN FRANCISCO — At the annual San Francisco WordCamp, WordPress creator Matt Mullenweg shared a fascinating statistic about the platform.


In a talk that also included details on the next two versions of WordPress, Mullenweg said, “We’re now up to 18.9 percent of the web running WordPress. … We’re going to see the number of people who have WordPress as part of their daily habits grow exponentially.”

Around 66 percent of those sites and blogs are in English. Monthly pageviews for all WordPress sites and blogs rose to a massive 4 billion in 2013.

Mullenweg also said around 30 percent of respondents in a recent survey from WP Engine were aware of WordPress as an entity or brand.

WordPress just celebrated its tenth anniversary in May, and parent company Automattic raised a sizable $50 million funding round the same month.

Read Full Story here >>

DCA opposes inclusion of “.africa” gTLD strings in list of reserved names

DotConnectAfrica (DCA), one of the organizations that has expressed interest in implementing and managing the “.africa” gTLD, has opposed plans to include the “DotAfrica,” “DotAfrique” and “DotAfriqiya” top-level internet domains in the List of Reserved Names, a move that would make the strings unavailable during ICANN’s new gTLD application process in February 2012.

In a commentary posted on the DCA website in reaction to an article published in ComputerWorld Kenya, DCA states: “The proposal to include DotAfrica gTLD in the List of Reserved Names is a tactic to make this string and similar strings in any language to be unavailable in this ICANN gTLD round so as to give special legislative protection that will benefit the AU, and give it extraordinary powers to separately negotiate and delegate these names outside the ICANN programme.”

The DCA post reacts to the article, which DCA claims creates the impression that “the ministerial meeting agreed that the .africa gTLD should be reserved,” which means “that organizations that want to bid to manage it must be sanctioned by the AU” and that “ICANN’s new gTLD application process provides for countries and regions with interest in certain names to reserve them.”

Read Full Story by Clicking here >>

Why You Should Not Rely on One Source of Web Traffic

Written by Nick Stamoulis

White hat SEO dictates that you take a blended approach to your link building. A diverse and consistent link building campaign demonstrates to the search engines your commitment to branding your site and building your online presence. It is important to make sure that your site isn’t flagged for trying to spam or “cheat” the algorithm in order to artificially boost your own ranking. But that isn’t the only reason it is important to diversify your link building.
A diverse link building portfolio means you will always have a viable source of traffic

I’ve read several blog posts recommending that site owners do away with their sites entirely, and shift all their focus to social networking sites. After all, that is where your customers are! That’s what the people want! It’s the future of online marketing! All of those things may be true, but I would never recommend that a company delete their site in favor of a social profile.
Let’s say that Facebook, the megalith of social networking sites, disappeared tomorrow. I realize that this is highly unlikely, but it is still a possibility. Or let’s say that Google+ really is the “Facebook killer” some claim it has the potential to be, and 90% of Facebook users migrate over to Google+. If your entire online marketing campaign centered on your Facebook profile, you would no longer exist! I realize that this is a bit of an extreme example; I don’t think social media is going away any time soon, but you have to consider the possibility.

Read more by clicking here >>

What cloud computing really means

Cloud computing is all the rage. “It’s become the phrase du jour,” says Gartner senior analyst Ben Pring, echoing many of his peers. The problem is that (as with Web 2.0) everyone seems to have a different definition.


As a metaphor for the Internet, “the cloud” is a familiar cliché, but when combined with “computing,” the meaning gets bigger and fuzzier. Some analysts and vendors define cloud computing narrowly as an updated version of utility computing: basically virtual servers available over the Internet. Others go very broad, arguing anything you consume outside the firewall is “in the cloud,” including conventional outsourcing.

Cloud computing comes into focus only when you think about what IT always needs: a way to increase capacity or add capabilities on the fly without investing in new infrastructure, training new personnel, or licensing new software. Cloud computing encompasses any subscription-based or pay-per-use service that, in real time over the Internet, extends IT’s existing capabilities.

Read more on infoworld.com

Mozilla releases faster, more stable Firefox 6

08/17/2011 | 09:00 AM

Mozilla on Wednesday released what it billed as a faster and more stable version of its Firefox Web browser, version 6.

One of the key changes in Firefox 6 was in the address bar, which now highlights the domain of the website a user is visiting, to thwart spoofing.

“The address bar now highlights the domain of the website you’re visiting,” it said in its release notes.

Mozilla also said it streamlined the look of the site identity block, and added support for the latest draft version of WebSockets with a prefixed API.

Support has also been added for EventSource (server-sent events) and for window.matchMedia.
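
For readers who want to see what those additions look like in practice, here is a minimal, hedged sketch in TypeScript. It is illustrative only: the "/updates" stream and "ws://example.com/socket" endpoints are made-up placeholders, and the MozWebSocket check reflects the vendor prefix Firefox 6 used for the draft WebSocket API.

    // window.matchMedia: evaluate a CSS media query from script.
    const narrow: MediaQueryList = window.matchMedia("(max-width: 600px)");
    console.log(`Narrow viewport? ${narrow.matches}`);

    // EventSource (server-sent events): a one-way stream of updates from
    // the server to the page. "/updates" is an assumed endpoint.
    const stream = new EventSource("/updates");
    stream.onmessage = (event: MessageEvent) => {
      console.log(`Server said: ${event.data}`);
    };

    // WebSockets shipped behind a vendor prefix in Firefox 6, so
    // feature-detect before constructing. The URL is a placeholder.
    const WS = (window as any).WebSocket ?? (window as any).MozWebSocket;
    if (WS) {
      const socket = new WS("ws://example.com/socket");
      socket.onopen = () => socket.send("hello");
    }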

It likewise added Scratchpad, an interactive JavaScript prototyping environment, as well as a new Web Developer menu that gathers development-related items in one place.

Mozilla said the new Firefox boasts “reduced browser startup time” when using Panorama, adding that it has fixed several stability and security issues.

However, Mozilla also noted some issues in Firefox 6, including:

  • Arabic text on BBC.co.uk does not display correctly. The BBC has been notified of the issue.
  • For some users, scrolling in the main GMail window will be slower than usual.
  • Starting Firefox using a locked profile may cause it to crash.
  • In Windows, some users with certain graphics cards and drivers may see a small rendering error on some websites, while some users of Adobe Reader X have experienced instability when viewing PDF documents in the browser. Mozilla recommended uninstalling and reinstalling Adobe Reader X.
  • In Mac OS X 10.7 (Lion), users may see a crash when the file chooser dialog is shown. Apple has been notified of the issue. Users running Lion are no longer able to use gestures to navigate. Mozilla said this will be fixed in a future release. Mozilla also said this version of Firefox will not work on Macintosh hardware with Power PC CPUs.
  • In Linux, the video control buttons may not work when viewing QuickTime videos with libtotem. Also, users compiling from source might need a newer gcc and libstdc++ as the build requirements have changed. — RSJ, GMA News

.Com and .Net Price Increases Announced

VeriSign [ http://www.verisign.com ] has announced its almost-annual price increases for .com and .net domain names.

The wholesale cost from VeriSign for .com domain names will increase from $7.34 to $7.85 on January 15, 2012 and the registry fee for .net domain names will increase from $4.65 to $5.11.

The VeriSign fee doesn’t include ICANN’s 18 cent fee per year. So the wholesale cost of a .com domain name will be $8.03 and a .net will be $5.29.
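
As a quick sanity check on those totals, here is a throwaway TypeScript sketch that simply adds ICANN’s 18-cent annual fee to VeriSign’s registry fee; the fee values are the ones quoted above.

    // Wholesale cost = VeriSign registry fee + ICANN's $0.18/year fee.
    const ICANN_FEE = 0.18;

    function wholesaleCost(registryFee: number): string {
      return (registryFee + ICANN_FEE).toFixed(2);
    }

    console.log(`.com: $${wholesaleCost(7.85)}`); // $8.03
    console.log(`.net: $${wholesaleCost(5.11)}`); // $5.29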

VeriSign just renewed its contract with ICANN to run .net. It allows VeriSign to continue jacking up .net prices 10% a year. ICANN didn’t provide an explanation for this arbitrary increase.

VeriSign’s press release about the price increase mentions the increasing load of DNS queries the company handles.

This increase has come about after an agreement was signed with ICANN, the organisation responsible for managing domain names on the internet. Of course, the price increases will be passed down to the clients of registrars.

To justify this increase, Verisign indicates that it has been forced to increase its security due to multiple distributed denial-of-service (DDoS) attacks. The company states that it recorded more than 57 billion domain lookup requests on its servers each day in the first quarter of 2011. At this time, there are 96 million .com websites and 14 million .net websites in the world.

Google experiments with Hotel Finder search tool

Google has been ramping up its transportation search features for desktop and mobile, but now it is shifting into full-on travel mode with its Hotel Finder experiment.

The Next Web reports that the utility is “designed to help users find the perfect hotel.” Easier said than done, of course, but maybe something that Google creates is just crazy enough to work.

Google’s Hotel Finder (not to be confused with HotelFinder.com) can find the ideal accommodation for a particular user based on a few different priorities, such as location and budget. For example, when searching for where to stay, the user can draw shapes around neighborhoods using a mouse rather than searching by individual addresses.

Read more >>

This story originally appeared at ZDNet’s Between the Lines.

The future of IT jobs? It’s in three types of roles [ZDNet]

There’s a general anxiety that has settled over much of the IT profession in recent years. It’s a stark contrast to the situation just over a decade ago. At the end of the 1990s, IT pros were the belles of the ball. The IT labor shortage regularly made headlines and IT pros were able to command excellent salaries by getting training and certification, job hopping, and, in many cases, being the only qualified candidate for a key position in a thinly-stretched job market. At the time, IT was held up as one of the professions of the future, where more and more of the best jobs would be migrating as computer-automated processes replaced manual ones.

Unfortunately, that idea of the future has disappeared, or at least morphed into something much different.

The glory days when IT pros could write their own ticket evaporated when the Y2K crisis passed and then the dot-com implosion happened. Suddenly, companies didn’t need as many coders on staff. Suddenly, there were a lot fewer startups buying servers and hiring sysadmins to run them.

Around the same time, there was also a general backlash against IT in corporate America. Many companies had been throwing nearly-endless amounts of money at IT projects in the belief that tech was the answer to all problems. Because IT had driven major productivity improvements during the 1990s, a lot of companies over-invested in IT and tried to take it too far too fast. As a result, there were a lot of very large, very expensive IT projects that crashed and burned.

When the recession of 2001 hit, these massively overbuilt IT departments were huge targets for budget cuts and many of them got hit hard. As the recession dragged out in 2002 and 2003, IT pros mostly told each other that they needed to ride out the storm and that things would bounce back. But a strange thing happened. IT budgets remained flat year after year. The rebound never happened.

Fast forward to 2011. Most IT departments are a shadow of their former selves. They’ve drastically reduced the number of tech support professionals, or outsourced the help desk entirely. They have a lot fewer administrators running around to manage the network and the servers, or they’ve outsourced much of the data center altogether. These were the jobs that were at the center of the IT pro boom in 1999. Today, they haven’t totally disappeared, but there certainly isn’t a shortage of available workers or a high demand for those skill sets.

That’s because the IT environment has changed dramatically. More and more traditional software has moved to the web, or at least to internal servers, where it is served through a web browser. Many technophobic Baby Boomers have left the workforce and been replaced by Millennials who not only don’t need as much tech support, but often want to choose their own equipment and view the IT department as an obstacle to productivity. In other words, today’s users don’t need as much help as they used to. Cynical IT pros will argue this until they are blue in the face, but it’s true. Most workers have now been using technology for a decade or more and have become more proficient than they were a decade ago. Plus, the software itself has gotten better. It’s still horribly imperfect, but it’s better.

Read More >>

Moonpig.com sold to PhotoBox for £120 million

Online greetings card retailer Moonpig.com has been sold to online digital photo service PhotoBox for £120 million in a merger that will create one of Europe’s largest personal publishers, the companies have announced.

Nick Jenkins, the founder and chairman of Moonpig, is set for a multi-million pound windfall after agreeing to sell the business, including some of his 35 per cent stake, to PhotoBox. The former commodities trader at Glencore founded the business in 1999.

Jenkins plans to reinvest in the new business and will continue as an adviser to the merged company’s board of directors. He comments that the deal will enable Moonpig to enter new overseas markets and offer a wider range of products.

Under the terms of the £120 million purchase, Moonpig’s existing shareholders will roll over a portion of their holdings to the share capital of the new merged group. Existing shareholders including Highland Capital Partners, Index Ventures and Harbourvest have backed the deal, which also secured the support of a group of new private equity investors led by Insight Ventures, Quilvest Ventures and Greenspring Associates.

Read More >>

Picking the Right Keywords Will Make or Break Your Website

SEO is critical to your website’s success, but if your site is targeting the wrong keywords, you are doomed to fail.
Keyword Research Background Info
Keyword research revolves around finding the right keywords (or key phrases, which mean the same thing) for your website to target. For example, one of the key phrases my website targets is “web maintenance Tanzania”. Some keywords are more competitive than others. You can tell how competitive a keyword is by doing a Google search. Searching the key phrase “web developer” in Google returns over 129,000,000 web pages! (Searching with quotes around your key phrase tells Google to return only web pages that contain the exact phrase within your quotes. If you don’t use quotes, Google will include results that match only part of the key phrase.) If I targeted my website at the key phrase “web developer”, I would be competing with 129,000,000 web pages and would have no chance of success. Keyword research is about picking the battles you want to fight, and you only want to pick battles you can win, which is why having a good keyword tool is so important.
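To make the quoted-search tip concrete, here is a small illustrative sketch in TypeScript that builds an exact-match Google query URL for a key phrase; the function name and example phrases are mine, not part of any real tool.

    // Wrap the key phrase in quotes so Google returns only pages that
    // contain the exact phrase, rather than partial matches.
    function exactMatchSearchUrl(phrase: string): string {
      const quoted = `"${phrase.trim()}"`;
      return `https://www.google.com/search?q=${encodeURIComponent(quoted)}`;
    }

    // Compare competition for a niche phrase vs. a broad one.
    console.log(exactMatchSearchUrl("web maintenance Tanzania"));
    console.log(exactMatchSearchUrl("web developer"));
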
Keyword Tools
There are tools that give you valuable insights into the number of searches a keyword receives per month, where it receives those searches (in which country, state, or city), and the competition in general. Using the right keyword tool when doing your keyword research is extremely important.
Keyword Strategies
There are many books written on ways to pick keywords and ways to use those keywords on your website. I’ve read a ton of books on the subject and have never really been impressed by what I’ve read. A couple of weeks ago I stumbled upon 50 Keyword Strategies.