Archive for Web Design

Responsive vs. Adaptive Web Design


Adaptive vs. Full Responsive Website Design in Hospitality

Today, there are two prevailing “schools of thought” regarding website design and optimizing the hotel website for the multi-screen device world we live in: Responsive Website Design (RWD) and Adaptive Web Design (AWD). AWD is also known as Responsive Design on Server Side (RESS).

The main difference between RWD and Adaptive Design/RESS lies in the type of content served and the manner in which it is delivered to the different devices: desktop, mobile (smartphones) and tablet.

- Traditional Full Responsive Website Design (RWD) serves the full (and same) website across all devices (desktop, smartphone, tablet) by modifying and reshaping the exact same website so that it fits all screen sizes, from large desktop screens to small smartphone screens. The intent is to optimize the “viewing” experience regardless of the device being used by the website visitor.

- Adaptive Web Design (AWD) aka Responsive Design on the Server Side (RESS) customizes website content and the overall user experience to the device (desktop, mobile, tablet) the website visitor is using. This is achieved from the same Content Management System (CMS), and ensures the best possible user experience, relevancy of information and conversions on each device.
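As an illustrative sketch of the server-side (RESS) approach, the same CMS can inspect the incoming User-Agent header and pick a device-specific template. The function names, token lists and template filenames below are hypothetical, not taken from any particular CMS:

```python
# Hypothetical sketch of server-side device detection (the RESS approach):
# one CMS, different templates depending on the requesting device.

MOBILE_TOKENS = ("iphone", "android", "blackberry", "windows phone")
TABLET_TOKENS = ("ipad", "tablet")

def detect_device(user_agent: str) -> str:
    """Classify a User-Agent string as 'mobile', 'tablet', or 'desktop'."""
    ua = user_agent.lower()
    # Check tablet tokens first: tablet UAs can also contain mobile tokens.
    if any(tok in ua for tok in TABLET_TOKENS):
        return "tablet"
    if any(tok in ua for tok in MOBILE_TOKENS):
        return "mobile"
    return "desktop"

def select_template(user_agent: str) -> str:
    """Pick a device-specific template rendered from the same content source."""
    templates = {
        "mobile": "home_mobile.html",    # slimmed-down content, click-to-call
        "tablet": "home_tablet.html",    # touch-friendly layout
        "desktop": "home_desktop.html",  # full content and imagery
    }
    return templates[detect_device(user_agent)]
```

In a full responsive design, by contrast, every device receives the same page and the reshaping happens client-side, which is why the trade-offs listed below arise.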

Here are some important factors to consider when deciding on Full Responsive vs. Adaptive/RESS Design:

A- Cost: a Full Responsive Design typically requires up to 50% more website design and development time compared to an Adaptive/RESS Design, which translates into higher website cost and longer delivery time.

B- Design: Full Responsive Design is practical for simpler, “thinner” websites (15-20 content pages) that require simpler design that can render well on all screen sizes and devices. For boutique, luxury, high-end or upscale hotels or resorts that want to a) stand out from the competition via custom website design (uniquely personalized to the property), and b) create an engaging and highly visual experience for the website visitor, an Adaptive/RESS Design would be the preferred option.

C- User Experience: Fitting the same website into every possible screen size via Full Responsive Design may address the viewing experience across all devices, although it may not fully accommodate other best practices such as optimum user experience, relevancy of information, download speeds, etc.

D- Full Responsive Design may provide a sufficient user experience for smaller and simpler websites (e.g. a 15-20 page restaurant website, a select or limited-service property website, or a database-driven website), but may negatively affect hotel websites with deep content and rich imagery.

E- With Full Responsive Design, you cannot differentiate what content displays on the same page across different devices. For example, you cannot have one image on the desktop homepage and a different image on the mobile homepage.

F- A side effect of this is mobile load time. Each mobile page needs to load all of the content of a desktop site. Mobile devices are generally much slower than a full desktop computer so they will take longer to process all of the content.

G- Important to note – when creating a link to an external site such as a booking engine, the external site might not have the ability to redirect to mobile automatically. Normally, this would mean using one link on the mobile page and a different link on the desktop page. With a fully responsive site, however, there is only one page, so you may need to resort to including both links on it.

H- For websites with deep content and rich imagery, the user experience may be compromised on mobile and tablet devices. For example, The New York Palace’s desktop website features deep content and rich imagery across over 500 pages, files, folders, PDFs, images, etc. Trying to research and quickly make a booking on the iPhone 6’s 750×1334 screen (if utilizing Full Responsive Design) would be difficult.

I- Download Speeds: Full Responsive designs typically suffer slower load time than Adaptive Designs because each page has additional script that is needed to fit the page for each particular screen size and device. Although not a significant issue for websites with minimal content (15-25 pages), deeper content websites with big imagery and lots of graphics may load at significantly lower speeds. Search engine rankings and conversions are increasingly dependent on fast load time speeds and travel consumers are less likely to book on websites with slow load time.

J- Relevancy of Information: There are differences in a user’s intent when they visit a hotel website on different devices. For instance, the always-on-the-go mobile traveler requires short, slimmed-down content with an emphasis on property location, area maps and directions, real time “smart rates” and availability, an easy-to-use mobile booking engine and a click-to-call property reservation number. Fully responsive websites do not let you display different content on mobile.

K- Conversions: Due to usability and security issues, six out of every ten mobile bookings actually happen via the voice channel. Very few smartphone users are comfortable entering their credit card information into their iPhone in a public place. Factors like relevancy of information, quick access to rates and availability or special offers and discounts are crucial for turning mobile lookers into bookers. Overwhelming a mobile user with the full desktop content of 100-300+ content pages of the desktop website means the user may not be able to quickly find the information they need in order to make a smart purchasing decision.

Digital Recommendations:

1- Choose Full Responsive Design (RWD) for websites that do not need more than 15-25 pages of content: restaurant websites, and websites for select-service to midscale-service properties. Design should be simpler and content should be kept minimal so the user experience is not compromised.

2- Choose Adaptive/RESS Design for premium, luxury, boutique, upscale and full-service properties, multi-property and brand websites with deeper content (26+ content pages), extensive imagery, and a complex product.

19 percent of the web now runs on WordPress

SAN FRANCISCO — At the annual San Francisco WordCamp, WordPress creator Matt Mullenweg told the audience a fascinating stat about the service.


In a talk that also included details on the next two versions of WordPress, Mullenweg said, “We’re now up to 18.9 percent of the web running WordPress. … We’re going to see the number of people who have WordPress as part of their daily habits grow exponentially.”

Around 66 percent of those sites and blogs are in English. Monthly pageviews for all WordPress sites and blogs rose to a massive 4 billion in 2013.

Mullenweg also said around 30 percent of respondents in a recent survey from WP Engine were aware of WordPress as an entity or brand.

The service just celebrated its tenth anniversary in May, and Automattic, the company behind WordPress.com, took a sizable $50 million funding round, also in May.


DCA opposes inclusion of “.africa” gTLD strings in list of reserved names

DotConnectAfrica (DCA), one of the organizations that have expressed interest in implementing and managing the “.africa” gTLD, has opposed plans to include the “DotAfrica,” “DotAfrique” and “DotAfriqiya” top-level internet domains in the List of Reserved Names, a move that would make the strings unavailable during ICANN’s new gTLD application process in February 2012.

In a commentary posted on the DCA website in reaction to an article published in ComputerWorld Kenya, DCA states that: “The proposal to include DotAfrica gTLD in the List of Reserved Names is a tactic to make this string and similar strings in any language to be unavailable in this ICANN gTLD round so as to give special legislative protection that will benefit the AU, and give it extraordinary powers to separately negotiate and delegate these names outside the ICANN programme.”

The DCA post reacts to the article, which DCA claims creates the impression that “the ministerial meeting agreed that the .africa gTLD should be reserved,” which means “that organizations that want to bid to manage it must be sanctioned by the AU” and that “ICANN’s new gTLD application process provides for countries and regions with interest in certain names to reserve them.”


Why You Should Not Rely on One Source of Web Traffic

Written by Nick Stamoulis

White hat SEO dictates that you take a blended approach to your link building. A diverse and consistent link building campaign demonstrates to the search engines your commitment to branding your site and building your online presence. It is important to make sure that your site isn’t flagged for trying to spam or “cheat” the algorithm in order to artificially boost your own ranking. But that isn’t the only reason it is important to diversify your link building.

A diverse link building portfolio means you will always have a viable source of traffic

I’ve read several blog posts recommending that site owners do away with their sites entirely, and shift all their focus to social networking sites. After all, that is where your customers are! That’s what the people want! It’s the future of online marketing! All of those things may be true, but I would never recommend that a company delete their site in favor of a social profile.

Let’s say that Facebook, the megalith of social networking sites, disappeared tomorrow. I realize that this is highly unlikely, but it is still a possibility. Or let’s say that Google+ really is the “Facebook killer” some claim it has the potential to be and 90% of Facebook users migrate over to Google+. If your entire online marketing campaign is centered on your Facebook profile, you no longer exist! I realize that this is a bit of an extreme example; I don’t think social media is going away any time soon, but you have to consider the possibility.


What cloud computing really means

Cloud computing is all the rage. “It’s become the phrase du jour,” says Gartner senior analyst Ben Pring, echoing many of his peers. The problem is that (as with Web 2.0) everyone seems to have a different definition.

As a metaphor for the Internet, “the cloud” is a familiar cliché, but when combined with “computing,” the meaning gets bigger and fuzzier. Some analysts and vendors define cloud computing narrowly as an updated version of utility computing: basically virtual servers available over the Internet. Others go very broad, arguing anything you consume outside the firewall is “in the cloud,” including conventional outsourcing.

Cloud computing comes into focus only when you think about what IT always needs: a way to increase capacity or add capabilities on the fly without investing in new infrastructure, training new personnel, or licensing new software. Cloud computing encompasses any subscription-based or pay-per-use service that, in real time over the Internet, extends IT’s existing capabilities.


Mozilla releases faster, more stable Firefox 6

08/17/2011 | 09:00 AM

Mozilla on Wednesday released what it billed as a faster and more stable version of its Firefox Web browser, version 6.

One of the key changes in Firefox 6 was in the address bar, which now highlights the domain of the website a user is visiting, to thwart spoofing.

“The address bar now highlights the domain of the website you’re visiting,” it said in its release notes.

Mozilla also said it streamlined the look of the site identity block, and added support for the latest draft version of WebSockets with a prefixed API.

Support has also been added for EventSource and server-sent events, and for window.matchMedia.

It likewise added Scratchpad, an interactive JavaScript prototyping environment, as well as a new Web Developer menu item into which development-related items have been moved.

Mozilla said the new Firefox boasts of “reduced browser startup time” when using Panorama, adding it has fixed several stability and security issues.

However, Mozilla also noted some issues in Firefox 6, including:

  • Arabic text on the BBC website does not display correctly. The BBC has been notified of the issue.
  • For some users, scrolling in the main GMail window will be slower than usual.
  • Starting Firefox using a locked profile may cause it to crash.
  • In Windows, some users with certain graphics cards and drivers may see a small rendering error on some websites, while some users of Adobe Reader X have experienced instability when viewing PDF documents in the browser. Mozilla recommended uninstalling and reinstalling Adobe Reader X.
  • In Mac OS X 10.7 (Lion), users may see a crash when the file chooser dialog is shown. Apple has been notified of the issue. Users running Lion are no longer able to use gestures to navigate. Mozilla said this will be fixed in a future release. Mozilla also said this version of Firefox will not work on Macintosh hardware with Power PC CPUs.
  • In Linux, the video control buttons may not work when viewing QuickTime videos with libtotem. Also, users compiling from source might need a newer gcc and libstdc++ as the build requirements have changed. — RSJ, GMA News

.Com and .Net Price Increases Announced

VeriSign has announced its almost-annual price increases for .com and .net domain names.

The wholesale cost from VeriSign for .com domain names will increase from $7.34 to $7.85 on January 15, 2012 and the registry fee for .net domain names will increase from $4.65 to $5.11.

The VeriSign fee doesn’t include ICANN’s 18 cent fee per year. So the wholesale cost of a .com domain name will be $8.03 and a .net will be $5.29.
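The totals above are simple addition of the registry fee and ICANN’s per-domain, per-year fee; a quick sketch (the function name is just for illustration):

```python
ICANN_FEE = 0.18  # ICANN's per-domain, per-year fee in USD

def wholesale_cost(registry_fee: float) -> float:
    """Wholesale cost to a registrar: the registry fee plus the ICANN fee."""
    return round(registry_fee + ICANN_FEE, 2)

# New registry fees effective January 15, 2012:
com_total = wholesale_cost(7.85)  # .com: 7.85 + 0.18 = 8.03
net_total = wholesale_cost(5.11)  # .net: 5.11 + 0.18 = 5.29
```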

VeriSign just renewed its contract with ICANN to run .net. It allows VeriSign to continue jacking up .net prices 10% a year. ICANN didn’t provide an explanation for this arbitrary increase.

VeriSign’s press release about the price increase mentions the increasing load of DNS queries the company handles.

This increase has come about after an agreement was signed with ICANN, the organisation responsible for managing domain names on the internet. Of course, the price increases will be passed down to the clients of registration offices.

To justify the increase, VeriSign indicates that it has been forced to increase its security due to multiple distributed denial of service (DDoS) attacks. The company states that it recorded more than 57 billion domain lookup requests on its servers each day in the first quarter of 2011. At this time, there are 96 million .com websites and 14 million .net websites in the world.

Google experiments with Hotel Finder search tool

Google has been ramping up its transportation search features for desktop and mobile, but now it is shifting into full on travel mode with its Hotel Finder experiment.

The Next Web reports that the utility is “designed to help users find the perfect hotel.” Easier said than done, of course, but maybe something that Google creates is just crazy enough to work.

Google’s Hotel Finder can find the ideal accommodation for a particular user based on a few different priorities, such as location and budget. For example, when searching for where to stay, the user can draw shapes around neighborhoods using a mouse rather than searching by individual addresses.



This story originally appeared at ZDNet’s Between the Lines.

The future of IT jobs? It’s in three types of roles [ZDNet]

There’s a general anxiety that has settled over much of the IT profession in recent years. It’s a stark contrast to the situation just over a decade ago. At the end of the 1990s, IT pros were the belles of the ball. The IT labor shortage regularly made headlines and IT pros were able to command excellent salaries by getting training and certification, job hopping, and, in many cases, being the only qualified candidate for a key position in a thinly-stretched job market. At the time, IT was held up as one of the professions of the future, where more and more of the best jobs would be migrating as computer-automated processes replaced manual ones.

Unfortunately, that idea of the future has disappeared, or at least morphed into something much different.

The glory days when IT pros could name their ticket evaporated when the Y2K crisis passed and then the dot com implosion happened. Suddenly, companies didn’t need as many coders on staff. Suddenly, there were a lot fewer startups buying servers and hiring sysadmins to run them.

Around the same time, there was also a general backlash against IT in corporate America. Many companies had been throwing nearly-endless amounts of money at IT projects in the belief that tech was the answer to all problems. Because IT had driven major productivity improvements during the 1990s, a lot of companies over-invested in IT and tried to take it too far too fast. As a result, there were a lot of very large, very expensive IT projects that crashed and burned.

When the recession of 2001 hit, these massively overbuilt IT departments were huge targets for budget cuts and many of them got hit hard. As the recession dragged out in 2002 and 2003, IT pros mostly told each other that they needed to ride out the storm and that things would bounce back. But, a strange thing happened. IT budgets remained flat year after year. The rebound never happened.

Fast forward to 2011. Most IT departments are a shadow of their former selves. They’ve drastically reduced the number of tech support professionals, or outsourced the help desk entirely. They have a lot fewer administrators running around to manage the network and the servers, or they’ve outsourced much of the data center altogether. These were the jobs that were at the center of the IT pro boom in 1999. Today, they haven’t totally disappeared, but there certainly isn’t a shortage of available workers or a high demand for those skill sets.

That’s because the IT environment has changed dramatically. More and more traditional software has moved to the web, or at least to internal servers, and is served through a web browser. Many technophobic Baby Boomers have left the workforce and been replaced by Millennials who not only don’t need as much tech support, but often want to choose their own equipment and view the IT department as an obstacle to productivity. In other words, today’s users don’t need as much help as they used to. Cynical IT pros will argue this until they are blue in the face, but it’s true. Most workers have now been using technology for a decade or more and have become more proficient than they were a decade ago. Plus, the software itself has gotten better. It’s still horribly imperfect, but it’s better.

 sold to PhotoBox for £120 million

Online greetings card retailer has been sold to online digital photo service PhotoBox for £120 million in a merger that will create one of Europe’s largest personal publishers, the companies have announced.

Nick Jenkins, the founder and chairman of Moonpig, is set for a multi-million pound windfall after agreeing to sell the business, including some of his 35 per cent stake, to PhotoBox. The former commodities trader at Glencore founded the business in 1999.

Jenkins plans to reinvest in the new business and will continue as an adviser to the merged company’s board of directors. He comments that the deal will enable Moonpig to enter new overseas markets and offer a wider range of products.

Under the terms of the £120 million purchase, Moonpig’s existing shareholders will roll over a portion of their holdings to the share capital of the new merged group. Existing shareholders including Highland Capital Partners, Index Ventures and Harbourvest have backed the deal, which also secured the support of a group of new private equity investors led by Insight Ventures, Quilvest Ventures and Greenspring Associates.