Online Stepping Stones

Affordable website solutions

Simple affordable web SEO partners


Thank you for visiting Online Stepping Stones, the provider of simple affordable web solutions. Please enjoy the content and links below.

SEO partners, articles and solutions below

Online Stepping Stones - Related Articles and News

How Panda and Penguin Algorithms Negatively Affect Your Website

[Disclaimer: Contributors' views are their own. They do not necessarily represent Devshed views.]

A top ranking on Google is a golden ticket to success. Companies spend fortunes on Search Engine Optimisation and Google advertising to raise awareness of their site, thus boosting their place on search results. Google is fully aware that companies are willing to do almost anything to get to number one on page one. So over the last few years, they have introduced increasingly stringent regulations to ensure that high-ranking websites are not only free of spam but also contain ‘high quality’ content.

While the delineation of quality might seem rather arbitrary, Google builds increasingly complex algorithms in order to ensure that it becomes ever more difficult to get a great ranking without spending some serious money on brand building. Google would perhaps prefer sites to spend money on their own advertising, so Google is increasingly clamping down on those wishing to get a keyword bump, creating an opaque situation that requires constant vigilance.

Google is, first and foremost, a profit-generating enterprise; and the company is second to none in that regard. Its business model from the outset has been to provide a high-quality product that seems simple to the outside world but which, obviously, reflects a highly complex algorithm under the hood – able to provide the most reliable and accurate results on a consistent basis. This simplicity of user experience was evidenced in their meteoric rise to the top of the search engine world, eviscerating their competition in the process.

There’s not much you can Ask Jeeves these days; he has gone to the cyber afterlife due to Google’s peerless quality of search. Indeed, the very term ‘search’ has been replaced with the verb ‘to Google.’ It is certain that a company has gained ubiquity once its name shifts from a noun to a verb. The danger of such a shift is complacency, and Google has been very aware of this inherent danger in the ever-changing world of tech; they have innovated in various areas, from their Android mobile Operating System to Google Maps, Google Earth, and the multitude of G-products that we all use in daily life.

One of the major ways Google has been able to stay ahead of the game is by shifting and modifying the way they calculate the popularity of sites. The general user wouldn’t notice such a change. BBC, Microsoft, and other major corporate entities still dominate their realms as do other major niche providers, but how to decide on the popularity of a travel site, or a site selling sportswear? Their “popularity” is the general answer to the question; but dig a little deeper, and it becomes apparent that popularity is something that can be gained.

Whether it comes from having the most references to the site, having the most links to it, or having the most content matching specific search terms, there is a multitude of ways to benchmark popularity and then tailor output to fit within these parameters. The people at Google know that if they rank unhelpful websites on page one, then their customers will go elsewhere for their “Googling.” Thus, they alter their algorithms a fair amount.

Google Panda

Google Panda was Google’s 2011 attempt to stop websites from cramming in keywords and building link farms in order to increase their ranking. Quality, as ever, was Google’s priority here as they went about uprooting a whole industry that had been built up by exploiting the loopholes that were evident in the Google Search model.

Five years ago, it was possible, with some concerted effort, to put together a site that could dominate in chosen search terms and maintain a top spot with an increasing number of links and cloned content. However, Google Panda put an end to all of that as Google started banning sites that had built up their popularity this way.

All of a sudden, sites that were ranking highly started to fall off the main page and into obscurity as Google’s indexing system would blacklist sites that had a negative mark against them. This could happen even to sites that were mostly original but had hidden, away from plain sight, practices that were banned under the Google Panda algorithm. This update was nicknamed the Farmer Update as it put an end to link farms and set a number of business models into a tailspin.

Things to Watch Out for with Google Panda

No nonsense! The algorithm is built from human test cases. Programmers analyse many sites and flag content that is off limits. They then build an algorithm around these test cases and work with the algorithm until it is able to function automatically. So, the most obvious lesson is: don’t write nonsense. Grammatically incoherent work will get you flagged in no time at all.

No duplications! If more than 90% of the content on a page exists on another page of your site, you are in trouble. So be careful with your headers and borders. If you are repeating the same outline on every page, and your original content is minimal, you are in trouble. (A rough self-check is sketched after this list.)

No advertisement overload! If your page is little more than an advert, you will be blacklisted. By all means advertise, but think smart. The algorithm is so advanced these days that you have to think of it in human terms. Would a human notice that you are hosting a site just to link elsewhere? If so, Panda will too.

No farming! Is there an overload of keywords on your site? While keywords were once the golden ticket to search engine success, now they must be used sparingly and with caution. Of course, you need to optimise your Google search terms, but if you have 100 links to ‘boost your libido,’ you are going to get picked up.

No robotic content! If you have auto-generated content, you are in trouble. Panda identifies and blocks content that has clearly not been built by humans.
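For the duplication point, here is a minimal self-check sketch in Python. It compares two pages using word shingles and Jaccard similarity; the 90% threshold mirrors the rule of thumb quoted above rather than any official Google number, and the file names are hypothetical.

```python
# Near-duplicate check between two pages using 5-word shingles and
# Jaccard similarity. Threshold and file names are illustrative only.
import re

def shingles(text, n=5):
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(page_a, page_b):
    a, b = shingles(page_a), shingles(page_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

if __name__ == "__main__":
    score = similarity(open("page1.txt").read(), open("page2.txt").read())
    print(f"Shingle overlap: {score:.0%}")
    if score > 0.9:
        print("Warning: these pages look like near-duplicates")
```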

Cyrus Shepard’s August 2011 post ‘Beating Google’s Panda Update – 5 Deadly Content Sins’ is as relevant today as it was back then, and reviewing the array of Panda updates since the algorithm launched in February 2011 shows that Cyrus’s forward thinking has been proven correct.

Google Penguin

Once rules change, new ways to play the game are quickly figured out. While Panda hit many firms hard—SEO firms, especially—it was quickly realised that with some modification, it was still possible to nail a top ranking with some intelligent application of keywords. The preferred method at this time was to position keyword-heavy articles on sites, often fictitious ones, that linked back to each other. Much of this content was humorous and whimsical, whether for travel-focused websites, dating websites, or any other industry website that it is possible to conceive of. It was rumoured that famous authors were picking up decent fees for stream-of-consciousness writing filled with hot-topic keywords and subordinate keywords.

Before long, the ever-alert Google machine realized what was going on and decided that further algorithm changes were necessary. Thus, the reign of the Panda was not over; rather, it was joined by another anthropomorphic Google algorithm.

Quality was again the catalyst for the launch of Google Penguin in April of 2012. This update targeted what it referred to as “Webspam” with the intention of penalizing sites that did not meet expected standards of quality.

Again, the topic of quality is one which seems to be defined in a rather subjective manner by Google, but overall perhaps ‘usefulness’ would be a better description. Once somebody puts in a search term, does the result they get actively address their enquiry or just point to another site that is off topic? This was the motivation behind the Penguin shift.

Things to Watch Out for with Google Penguin

Quality. The ever-elusive entity of quality can be measured in various ways. Is the content authoritative? Is it linked to by a variety of sources? How relevant are these sources? Is there a diversity of comments? If so, you will probably be able to pass the quality threshold.

Link relevancy. Stick to your niche. If you are posting links to sites that are too different from your own, then you may be flagged. Stick to a web of interconnected sites, and you will be able to build your niche positioning.

Organic linking. If you suddenly have an upsurge in links, this is likely to set alarm bells ringing. Links built up over time have more cachet and add to the perception of your site as authoritative and possessing quality content.

Diversity of anchor texts. Don’t repeat the same keywords in your anchors as this is a major red flag. Use synonyms or similar terms rather than repeating yourself. (A quick audit is sketched after these points.)

Link quality. Getting links from sites that are red flagged will also come back to haunt you. Be sure that when you are building your links, you are choosing link quality over quantity. Quality begets quality, after all.
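Here is a minimal sketch of the anchor-text audit mentioned above, assuming a backlink export in CSV form with an "anchor" column; the file name, column name, and the 30% over-optimisation threshold are all illustrative assumptions, not fixed rules.

```python
# Anchor-text diversity audit over a hypothetical backlinks.csv export.
import csv
from collections import Counter

counts = Counter()
with open("backlinks.csv", newline="") as f:
    for row in csv.DictReader(f):
        counts[row["anchor"].strip().lower()] += 1

total = sum(counts.values())
for anchor, n in counts.most_common(10):
    share = n / total
    flag = "  <-- looks over-optimised" if share > 0.3 else ""
    print(f"{share:6.1%}  {anchor}{flag}")
```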

In October, Jason DeMers published yet another extremely informative guide on how to recover post-Penguin 2.1, Penguin 2.1: What Changed Since 2.0, and How to Recover, which should give you a good overview; Glen Gabe’s follow-up findings were also extremely insightful.

How to Maximise SEO in the World of Panda and Penguin

A good place to start is to analyse your Web traffic. If you have seen a sudden dip, then there is a good chance you have fallen victim to these changes. Google publishes the dates of updates to its algorithm, so compare these with the work you have already done. If you find that you have suffered on these dates, then you need to figure out how this particular update has negatively affected you and start to make some changes.

If you can target the issue and resolve it, wait for 20-30 days and check your traffic again. If there has been no recovery, then you need to go back again and make further changes.
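As a sketch of that workflow, the snippet below compares a daily-sessions CSV export against a list of update dates; the file layout, the example update entry, and the 20% drop threshold are assumptions for illustration.

```python
# Flag week-over-week traffic drops around known algorithm update dates.
import csv
from datetime import date, timedelta

UPDATES = {date(2014, 5, 20): "Panda 4.0"}  # placeholder entry

daily = {}
with open("sessions.csv", newline="") as f:
    for row in csv.DictReader(f):  # expects columns: date, sessions
        y, m, d = map(int, row["date"].split("-"))
        daily[date(y, m, d)] = int(row["sessions"])

for day, name in UPDATES.items():
    before = sum(daily.get(day - timedelta(n), 0) for n in range(1, 8))
    after = sum(daily.get(day + timedelta(n), 0) for n in range(1, 8))
    if before and after / before < 0.8:
        print(f"Possible hit from {name}: traffic down {1 - after / before:.0%}")
```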

As ever, Rand Fishkin leads the way when it comes to all things SEO, and his presentation at the Digital Marketing summit is essential viewing for all those who wish to understand the issues facing companies who wish to dominate on Google today.

In conclusion, the key lesson to take from all of this is that Google is going to keep innovating, and it is going to keep aiming to increase the quality of search results. Expect further changes in years to come as Google seeks to solidify itself and remain top dog.

Be extra careful with anything you publish on your site. If it is a copy-and-paste job, then you risk a ban. If you are full of links that seem irrelevant, you will be banned. Low-quality content can put your site at risk of being banned. Google’s algorithm is turning into an editor which expects high-quality content if it is going to give you front-page space on its search engine. So, keep the editor happy with clean, well-written, and interesting prose; connect with a professional SEO company, and watch your stock rise.

The post How Panda and Penguin Algorithms Negatively Affect Your Website appeared first on SEO Chat.

Source: SEO Chat | How Panda and Penguin Algorithms Negatively Affect Your Website | 20 Jul 2014, 3:25 pm

BEST Social Media Tools: Are You Using Them? (Collective Mastermind)

We are talking social media tools today!

In our panel today:

Shannon Hutcheson, @ldylarke, an independent SEO consultant with huge experience in copy editing, social media marketing and link building
Barbara Boser, @BarbaraBoser, Presidential Diamond Distributor at It Works Global
Meghan Riley, @pixiechaser, “Writer, dreamer, gamer..”
Clayton Wood, Executive with broad experience in building and scaling SEO
Kyle Sanders, Owner of a digital marketing consultancy and several eCommerce businesses
Don Sturgill, @Don_Sturgill, Writer and social media enthusiast
Steve Toth, Content Marketing Manager at @TechWyse

Questions we’ve been discussing:

Which tools let you multi-task between various social media sites?
Are you using any scheduling tools?
Do you use any tools to track which of your followers have influence?
Best tools to manage and grow social media interactions?
Any other social media tools you are using on a regular basis? Please share!

Q. Which tools let you multi-task between various social media sites?

A. ldylarke (SEO Consultant)

For managing my social media tasks, I have found Hootsuite to be the most cost effective for multiple accounts. I prefer Hootsuite because I can add unlimited social media accounts for Twitter, Facebook AND Google Plus. When I post content via social media, I can select up to 5 accounts to post to at once. For instance, I have a personal Twitter account and a site related Twitter account. When I post content related to SEO, I want to post to those two different Twitter accounts, the site’s Facebook and Google Plus pages as well – for a total of 4/5 accounts per posting. All done at the same time and from the same place.

A. BarbaraBoser (CEO)

I am an avid user of Hootsuite. I use the bulk scheduler to upload my monthly status updates. I have all of my networks added to Hootsuite and can easily go back and forth between each. Mind you, I have never used anything else, but Hootsuite serves its purpose for me and does exactly what I need. I typically get all of my status updates in, then review each one and add images. Doing this for the month can take about 3 hours, but it is well worth the time. I feel that it has helped me immensely in terms of building my brand.

A. Meghan Riley (I Love Social Media Tools!)

I primarily use Hootsuite, because I can set up my social media sites to appear on separate tabs that I can quickly flip between. It doesn’t have all the features I want, but it helps me see multiple accounts at once, so I can keep on top of trends and information I might want to share. It also allows me to schedule multiple updates at once, which really comes in handy when I know I’m going to be away from the computer for an extended period of time. I use it for my personal accounts and the accounts for my real estate team.

Q. Are you using any scheduling tools?

A. ldylarke (SEO Consultant)

Hootsuite Pro also allows me to schedule posts quickly and easily. I have also used Buffer, but why pay for more services when you can get the same thing done at the same place? I find Hootsuite more than able to handle all of my usual social media tasks for all of my social media profiles. Why go anywhere else to use a tool when one can use a single tool to do everything one needs to do all at the same place and at the same time? Hootsuite Pro is not only reasonably priced, but it provides everything I need.

A. Meghan Riley (I Love Social Media Tools!)

As I mentioned before, I use Hootsuite to schedule posts when I know I’m not going to be near the computer for an extended period of time, like on the weekends or days I’m at luncheon events. I can’t schedule all of the post types I like, so if I need to post, say, a video, I’ll schedule it directly through the Facebook scheduling tool. Hootsuite also allows you to shorten URLs in the program, which helps with tracking click-throughs. But, when possible, I try to post directly to a social media site, because they often have more options to choose from, like the picture to use while posting on Google Plus.

A. ClaytonWood (Managing Partner)

We schedule our social media posts using Hootsuite. Since we’re a content marketing company, we manage dozens of different social media projects every day. This helps us stay on top of our posts. The thing I like about Hootsuite is that you can monitor different social feeds and networks in 1 simple view.

This allows me to publish great copy for my clients regardless of where they are located around the world. Timing is everything in social and this tool helps ‘strike while the iron is hot’. We use the Hootsuite Enterprise solution because of the flexibility and scale it brings to our organization.

Q. Do you use any tools to track which of your followers have influence?

A. ldylarke (SEO Consultant)

I don’t really track which of my followers have influence. I know who is noteworthy in my industry because their name pops up in social media or is well-known. If I needed to find out who was most influential, I could refer to Commun.it (I do get notifications now from this site), Followerwonk, or Klout. I find Commun.it is more than sufficient for me when I want to know more about a specific social media profile or account. I have searched for other social media tools that help you find influential profiles to follow, but I always end up back using just Hootsuite.

A. ClaytonWood (Managing Partner)

I like Kred as a social influence tool. It measures both social influence and outreach on the same platform. Its influence metric mainly relies on retweets, follows, replies and mentions on Twitter. The same goes for how it measures Facebook.

The way it measures outreach is interesting too. It’s a total of how much you retweet, share, reply to and mention other people. This way, it can tell you how much you’re affecting other feeds. Each social action you or your followers make gets a score. On the activity page you can see which actions specifically have contributed to your overall Kred score.

Q. Best tools to manage and grow social media interactions?

A. ldylarke (SEO Consultant)

I like commun.it to manage and grow my Twitter accounts. I especially like the “consider to re-engage” feature. It lists who you haven’t spoken to for a while and will send a “hello, how are you?” kind of Tweet. This is especially helpful to remind you to keep in contact with those you want to. I also use commun.it to unfollow people who are either not following me back, or who don’t make very many updates (or at least fairly frequent updates) to their social media account. No point following these kinds of accounts. If they aren’t interacting much, then there’s no point keeping them on your social media list.

A. Kyle Sanders (Head of Search)

We primarily deal with clients in the industrial and mechanical spaces, and LinkedIn has become an increasingly important channel for us over the last couple of years. A colleague suggested trying Rapportive and it has proven to be a great asset. Cannot recommend it enough. Not only are we able to seamlessly connect via LinkedIn without opening a separate tab (I have a proclivity for too many tabs), we’re privy to a litany of contact information and social updates, which basically morphs Gmail into a micro-CRM.

For anyone working client/customer management, it’s a great plugin that can streamline your communications and makes Gmail more useful and interesting. Plus, it’s free.

A. ClaytonWood (Managing Partner)

I use Followerwonk for helping to grow interaction with relevance. It’s a great tool because I can find active people with similar views or interests and start creating great relationships based around that. My strategy is basically to find, connect and learn from the person based on his tweets.

Research is the key to using this tool. So make sure you find the profile’s website and other social media outlets. You want to build quality connections so that you can learn more about your field. There is a great ‘retweet’ tool in Followerwonk that allows you to filter out the more spammy types of Twitter accounts.

Q. Any other social media tools you are using on a regular basis? Please share!

A. Don Sturgill (Writer)

Hootsuite has been my go-to tool for monitoring Twitter accounts … and I love TwChat for group chats. My real consternation is cutting loose enough time to stay in touch across multiple platforms. I look forward to tips from other members about time-saving tools and methods. I know social media interaction is important, but the time drain can be significant. Being able to simultaneously post across multiple social sites can be helpful … but it can also be redundant.

Another thing: It may be that listening is as valuable as (or even more valuable than) talking on social media. Tools that allow me to search for pertinent terms and follow/join the conversation can be critical to social media success.

A. ldylarke (SEO Consultant)

ViralContentBuzz.com is really the only other social media tool I’m using. Other than Hootsuite Pro, I sometimes use RebelMouse but more often, Scoop.it to find and share relevant content to my followers.

While I understand why there is a minimum word requirement, I find I have to try to be overly wordy just to meet it. Especially where my answers do not require elaboration. Just saying.

A. Steve Toth (Content Marketing Manager)

Facebook text detection: This is a tool my company TechWyse built. It allows you to see if your ads meet Facebook’s 20% text guidelines. The tool has saved people thousands of hours of revising their ads due to them being rejected for having too much text.

We were written up in All Facebook.com and our tool has a steady user base that swears by it.

It works by placing a grid over your ad, and it’s up to you to “punch out” the areas that have text. Once you have reached 20% of the area, the tool alerts you, and it’s up to you to alter the image.

This tool is very handy for designers and social media managers. We hope your readers enjoy it, too.
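To make the mechanics concrete, here is a toy version of that grid check; the 5x5 grid follows the description above, while the punched-out cells are made-up sample data.

```python
# Toy 20%-text-rule check: the ad is split into a 5x5 grid and the user
# marks ("punches out") cells containing text; sample data is invented.
ROWS, COLS = 5, 5
punched = {(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (1, 2)}  # cells with text

coverage = len(punched) / (ROWS * COLS)
print(f"Text covers {coverage:.0%} of the ad")
if coverage > 0.20:
    print("Over the 20% guideline -- rework the image before submitting")
```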

A. Meghan Riley (I Love Social Media Tools!)

Canva.com is my new favorite tool, because it allows me to create visually appealing graphics quickly right in my browser. There are templates to choose from or you can build a graphic from the bottom up. You can choose from multiple sizes of graphics (Facebook Post, G+ Cover Photo, Business Card, Blog Graphic, Pinterest Graphic, etc.). There are hundreds of free pieces to choose from (frames, stamps, text bubbles, infographic pieces, etc.) or upload your own. Additionally, you can choose from many other pictures and artwork, and pay $1 each for them, which is pretty cheap considering what stock photo sites are charging nowadays. I definitely recommend everyone check them out.

A. ClaytonWood (Managing Partner)

I took the plunge and started using Signals from HubSpot. I’d classify it as ‘social’ because it shows you when people open your emails, and with the Insights tool, you can see social media data on any website you’re browsing in 1 click. From what I can tell, it’s using a LinkedIn API or some third party to gather all the website data and display it for you in 1 click.

Information like how large the company is, how many people work there, how old the company is, and what social connections you have with the company is all there to see. It’s pretty awesome!

What’s your take? Please share in the comments below!

The post BEST Social Media Tools: Are You Using Them? (Collective Mastermind) appeared first on SEO Chat.

Source: SEO Chat | BEST Social Media Tools: Are You Using Them? (Collective Mastermind) | 18 Jul 2014, 8:20 pm

5 Advanced Conversion Rate Optimization Methods To Increase Sales Online

In the past few years, Conversion Rate Optimization has become as essential to a business as SEO. If you’re not optimizing – and your competitors are – you’re missing opportunities to gain new customers.

While A/B testing is one piece of CRO, simply testing different variants of your site against each other is only one way to increase conversion rate. For most businesses, split testing will get you an initial lift but won’t go the extra mile to a higher conversion increase.

Here are 5 more advanced methods for you to personalize your site. To make things easy, I’ve included real-life examples of each along with the software you can use – enjoy!

1. Personalization

There is absolutely no question that personalizing your site will lead to higher, sustained conversion rates. Giving each customer exactly what they want at exactly the right moment is one of the most reliable ways for you to generate more sales.

Here’s an excellent example from Gardener’s Supply Company:

The company noticed that its conversion rate from Pinterest users was low compared to other sources of traffic. So, they worked with the team at Evergage to show a personalized banner, with a special offer, to each visitor who came to their page from Pinterest.

This simple banner saw a 3x increase in conversion!

Generating 10% – 50% conversion increases is not uncommon when personalizing a site. And not only is the conversion lift significant, it’s often sustainable, lasting for years (a less talked-about fact in CRO is that A/B splits can have a shelf-life and stop converting as highly – that’s a topic for another article).
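As a rough sketch of how referrer-based personalization like the Pinterest banner can work under the hood (the platforms listed below do this with point-and-click editors), here is a minimal example assuming a Flask app; the copy and setup are illustrative.

```python
# Referrer-based banner personalization, sketched with Flask.
from flask import Flask, request, render_template_string

app = Flask(__name__)

PAGE = """
{% if banner %}<div class="banner">{{ banner }}</div>{% endif %}
<h1>Garden Supplies</h1>
"""

@app.route("/")
def home():
    banner = None
    # Visitors arriving from Pinterest get a tailored offer banner.
    if "pinterest." in (request.referrer or ""):
        banner = "Welcome, Pinterest fans: here is a special offer!"
    return render_template_string(PAGE, banner=banner)

if __name__ == "__main__":
    app.run()
```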

If you’re interested in personalizing your site, here are a few resources for you to try:

Evergage: Generally the easiest to use, with a point-and-click editor that makes it easy for you to segment audiences.
Optimizely: Makes it easy for you to create pages based on traffic sources – similar to Evergage but in my opinion not as robust in personalization.
Personyze: Focused on e-commerce. Makes it easy for you to offer different products, different creative, etc. to different users. The software is not as easy to use as Evergage or Optimizely but it does have some advanced features for e-commerce.

2. Dynamic Pricing

For any e-commerce business to succeed, pricing must be competitive – I cannot find a clear study on this, but anecdotally I can tell you that your pricing strategy can easily affect your revenue and profit by 10% – 20%. Especially if you are selling a product in a competitive environment, you need to have a handle on your pricing strategy.

Let’s say that you type into Google “laptop case”:

Then, you decide that you want the Pelican 1085 HardBack case:

Which one are you going to get? This is a heavily researched question, but I’m going to bet that you’ll go with the retailer with (1) the lowest price + shipping, (2) the best return policy, and (3) the best overall brand experience.

Even this simple example raises a number of interesting points:

How do you price on intent and product? Clearly a commoditized sale (like this case, where the product is the same everywhere) has a far different pricing strategy than an intent-driven sale, where a customer comes directly to your website.
What do you optimize for? Conversion, profit, revenue, etc.? Most people will simply say “profit” but this turns out not to be an easy question to answer – simply acquiring a customer will often have benefits (like long-term customer value).

For most e-commerce merchants between $500K and $10M in revenue, simply optimizing price for conversion is going to be your best bet. There are a few platforms we recommend for doing this:

Wiser: Software is easy to use. Unfortunately they do not offer plans for smaller businesses, but they are a great fit for clients with $1M+ in revenue.
AppEagle: This is a great product for an Amazon-focused retailer. They are not as strong, though, for retailers who sell across multiple channels.

3. Dynamic Landing Pages

Simply adjusting the headlines of your landing page to match – or closely match – the search query text can boost conversion rate by 10%.

Here’s an example:

Personalizing your landing pages to PPC traffic is a general best practice, and fortunately it’s extremely easy to do this. The best tool for you to use for this is:

Unbounce: Probably the easiest landing page builder. You can easily create different versions of your landing page and then customize the headlines or background images for different traffic sources.

You could also simply create duplicate versions of the landing page and use dedicated landing pages – many DIYers do this and it’s a perfectly good practice.
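For the DIY crowd, here is a minimal sketch of the dynamic-headline idea, assuming a Flask app and a hypothetical utm_term parameter carrying the paid-search query; tools like Unbounce handle this without code.

```python
# Dynamic landing page headline keyed off a hypothetical utm_term param.
from flask import Flask, request, render_template_string

app = Flask(__name__)

HEADLINES = {
    "laptop case": "Rugged Laptop Cases, Shipped Free",
    "laptop sleeve": "Slim Laptop Sleeves for Every Model",
}

@app.route("/landing")
def landing():
    term = request.args.get("utm_term", "").lower()
    # Fall back to a generic headline when the query is unknown.
    headline = HEADLINES.get(term, "Protect Your Laptop in Style")
    return render_template_string("<h1>{{ h }}</h1>", h=headline)

if __name__ == "__main__":
    app.run()
```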

4. Speed

Just a one-second increase in page load time can decrease conversions by 7% or more.

Here’s an excellent graph from Web Performance Today showing how page speed affected conversion rate at Walmart:

I often find when optimizing conversion rate of clients’ websites that page speed is overlooked – not because people don’t realize it’s important, but because people often don’t realize that their page speed could be improved.

Here are a few excellent resources for you to check your website’s speed:

Pingdom: This may be the single best spot for you to check your website for performance issues.
GTMetrix: Also a great resource; results will be similar to Pingdom.
Google Page Speed: For developers, Google offers a page speed API that will let you build speed testing into any one of your apps.

Generally, the best tip for you to optimize page speed is to reduce the amount of code that you’re using on your site or to upgrade to a faster platform (for example, Angular.js, if implemented correctly, can be extremely, extremely fast). This is easier said than done, however, and is a topic for another article.
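If you want a quick, scriptable baseline before reaching for those services, here is a crude probe using the requests library; note it measures time-to-last-byte for the HTML only, not full page rendering, so treat the numbers as indicative.

```python
# Rough time-to-last-byte probe; full rendering time will be higher.
import time
import requests

def time_to_last_byte(url, runs=5):
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        requests.get(url, timeout=30)
        timings.append(time.perf_counter() - start)
    return min(timings), sum(timings) / len(timings)

best, avg = time_to_last_byte("https://example.com/")
print(f"best {best:.2f}s, average {avg:.2f}s over 5 runs")
```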

5. Response Time

For any advertisers who are in the lead-generation business – or even customer support – response time is one of the biggest variables that affects whether your leads become paying customers (or whether your customer support customers end up referring you or not).

Lead Response Management did an impressive study showing the best time to follow up with leads and how your speed affects response:

If you call a lead 5 minutes after they email or call you, your chances of reaching them are 100 times better than if you call them 30 minutes later. If you call a lead in the first hour, your chances are 10 times better than afterwards.

Of course this can be difficult for international lead generation businesses, but where there is real urgency – say, for example, someone searching for an auto accident lawyer – being the first person to respond to a lead can often seal the deal.

Here are a few resources for you to try in order to improve your speed management:

Infusionsoft: One of my favorite CRMs, with an excellent marketing automation solution that will let you auto-respond to leads immediately. If you choose to automate your response emails, be sure to personalize them.
Aweber / Mailchimp / etc.: Any decent email marketing software will have auto-responder features built in.

For higher value sales, I would recommend trying to respond to as many leads as you can yourself, but for high volume lead-gen, an automation solution might be best.
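As a bare-bones illustration of the instant-response idea (real CRMs add templates, retries, and personalization), here is a sketch that assumes leads arrive as dicts from a form handler and an SMTP server runs on localhost.

```python
# Instant lead auto-response sketch; addresses and server are placeholders.
import smtplib
from email.message import EmailMessage

def auto_respond(lead):
    msg = EmailMessage()
    msg["From"] = "sales@example.com"
    msg["To"] = lead["email"]
    msg["Subject"] = f"Thanks for reaching out, {lead['name']}"
    msg.set_content(
        f"Hi {lead['name']},\n\n"
        "Thanks for your enquiry -- a real person will call you "
        "within the next few minutes.\n"
    )
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

auto_respond({"name": "Pat", "email": "pat@example.org"})
```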

About The Author

Andy Hunt is the founder of UpliftROI.com. He’s got years of solid experience in conversion rate optimization and digital marketing in general.

The post 5 Advanced Conversion Rate Optimization Methods To Increase Sales Online appeared first on SEO Chat.

Source: SEO Chat | 5 Advanced Conversion Rate Optimization Methods To Increase Sales Online | 11 Jul 2014, 10:30 pm

Simplify: Email Marketing Using Autoresponders 101

Are you depending on Google to keep sending you traffic? What if you stop ranking so well? Or your site gets a penalty? Maybe you are paying for traffic over and over. Why would you want to do that? You could be capturing each visitor to your list and keeping them there. That is what auto-responders do.

By offering a series of useful email messages to your visitors, you can get them to give you their email address and keep them coming back. The best way to grow your list is to offer something of value. At one time most people used eBooks. The problem with that is the high percentage of subscribers who download what you offer and then immediately unsubscribe. If you instead send the same information as an autoresponder series, your subscribers have to stay on your list to receive all of it.

This works – but many are not doing this as well as they could. Think about the emails you receive. Personally, I read the short ones as soon as I open them. But the long ones I often save for “later” – but that “later” never comes. I bet many other people do that, too.

Goal: Get Your Emails Opened

The goal is to get your emails opened. By sending really short emails your subscribers will be more likely to open them. They may not do this consciously, but subconsciously they’ll associate your emails with “short and painless”. That is what you want. Their goal is to get information in easily digested bites.

Your goal is to keep them opening or at least wanting to receive your emails so when they are ready to buy what you offer they remember you. Search traffic converts because people use it when they’re ready to buy. If they remember you they won’t need to search – they can just click on your link in your emails.

Remember to ALWAYS include a link back to your site!

Simplify the Process

What is easy gets done more often. The best way to create your autoresponder series is to use a template you can edit each time. That way your header, logo, slogan, social sharing buttons, and links to your social networking accounts are already in place. You just edit the text.

Email solutions providers are making templates easier to use. There is a video about autoresponders that explains the many features that are now built into templates. If you make your messages flow in a sequence, your subscribers will be more likely to be happy to receive them. Here’s also a quick and sweet intro to autoresponders by ProBlogger.

One way to do that is to write a how-to and then break it up into small sections. Or you can make a list of priorities in order and write each one separately. Do whatever works best for you. Remember to always focus on what your potential buyers want – NOT what you want.

Give your subscribers what they want to receive and you can keep them on your list reminding them to choose your company. Over time you build trust that increases conversions. Do this consistently and your profits will grow.
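To illustrate the series idea, here is a tiny scheduler sketch: a how-to broken into short messages, each sent a fixed number of days after signup. The offsets and subjects are placeholders; your email provider's autoresponder does the actual sending.

```python
# Drip-series scheduler sketch with placeholder offsets and subjects.
from datetime import date, timedelta

SERIES = [
    (0, "Welcome! Step 1: set your goal"),
    (2, "Step 2: pick your tools"),
    (5, "Step 3: your first campaign"),
]

def send_dates(signup):
    # Map each message to its calendar send date for a given signup day.
    return [(signup + timedelta(days=offset), subject)
            for offset, subject in SERIES]

for when, subject in send_dates(date(2014, 7, 1)):
    print(when, "-", subject)
```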

Mobile Responsiveness

Studies continue to show increases in the percentage of email being opened on smartphones. Make sure you have updated all your email templates and landing pages to work on mobile devices.

Do not assume they work. You need to actually test them on as many devices as possible. Verify that the header, images, and videos resize correctly. If they don’t, you will need to update your site with new code, a new theme, or plugins.

Click on the social media accounts and verify they take your readers to each specific social account. Test every social sharing button – or at least the major platforms – and ensure that it works as expected. Configure shared tweets to include your Twitter username and make sure images share properly on Facebook, Pinterest, and LinkedIn.
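Part of that testing can be automated. Here is a quick sketch that extracts every link from a locally saved template and confirms each one responds; the file name is hypothetical, and it checks reachability only, not that a button lands on the right profile.

```python
# Check that every href in a saved email template responds.
import re
import requests

html = open("newsletter.html").read()  # hypothetical template file
for url in set(re.findall(r'href="(https?://[^"]+)"', html)):
    try:
        code = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException as exc:
        code = exc.__class__.__name__
    print(code, url)
```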

Repeat Visitors Are More Likely to Buy

Building your list, setting up autoresponders, and creating actual relationships with your visitors can turn missed opportunities into leads and sales. You must keep them coming back to buy.

The post Simplify: Email Marketing Using Autoresponders 101 appeared first on SEO Chat.

Source: SEO Chat | Simplify: Email Marketing Using Autoresponders 101 | 11 Jul 2014, 7:54 pm

Google removing author pictures from search: Your input?

Last week Google announced it was removing author pictures from search results while keeping the author name. Seeing author pictures within search results was a huge competitive advantage, so no wonder this step was criticized by many authors who were participating in the Google Authorship feature.

From the good news: participating in Authorship has become easier…

If previously you could never be sure whether your author markup would make it into the SERPs, now all you need is to have your authorship correctly set up (which may be a bad thing too as, let’s face it, it’s easier for anyone to have now).
If previously you could only have ONE authorship snippet per SERP, now you’ll have all of them (if several of your articles have ranked, all of them will have your name).

I have been discussing this issue around the web and have collected some opinions. My Google Plus thread has lots of great insights, please check it out.

I especially liked this one from Shelly Cihan:

I support the removal. Knowing what a person looks like should not impact whether or not you click on a result in the SERP. It accentuated an already too vain society.

[Hard to disagree: Having an advantage in SERPs because your headshot looks nice doesn't seem fair at all!]

I have also collected some opinions from MyBlogU below:

Our interviewees were answering the following questions:

Do you believe Google has done that to optimize for mobile devices? Why not?
Do you believe @JohnMu that this will not affect click-through? Why not?
Will you still care to verify your content after this change?

Let’s see what they think:

Q. Do you believe Google has done that to optimize for mobile devices? Why not?

A. David Faltz (Founder White Rabbit Marketing. Search Engine & Branding Optimization (SEBO) Marketer)

I do believe that mobile probably did play some part in their decision to remove author images, but that is not the whole story for sure. They have been toying with author images for a while now, and they have not gotten people to conform as they wanted. With low adoption rates by what Google would consider “real authors,” and more people using it as a marketing tactic to stand out from the crowd, Google decided “enough was enough!”

A. Swayam Das (Social Media Marketer)

Umm… I really don’t think so! Google always has reasonable logic working behind each and every move. So I’ll just wait and see how things work out in the mobile space! Mobile search results tend to be location-oriented, so I don’t see much of a movement without any Authorship pics.

A. Marc Nashaat (Enterprise Marketing Consultant)

No, that’s not very likely. Google uses device detection to decide whether to serve up their mobile layout vs. desktop and they could just as easily style mobile to exclude authorship snippets. I don’t think it’s a matter of consistency as Google has been preaching the importance of different user experiences for mobile vs desktop for years now. 

A. Paul Shapiro (SEO Director at Catalyst)

I was a bit baffled at the decision to remove the author images from the SERP. I was a firm believer that when Vic Gundotra left Google, it was not the end of Google+.

This change, however, had me second-guessing the future of the platform. Surely the author images were a HUGE incentive for Google+ usage. Why in the world would they choose to remove one of its most significant features?

I have a number of theories beyond the typical answer of it helping prettify the SERP or creating a better mobile search experience:

Maybe it was negatively affecting AdWords CTR.
Google wants more eyes on the knowledge graph.
Now that x number of people are using authorship, they care less about incentivizing its use, or perhaps it started to lead to spammy usage.
It detracted from the CTR of the ranking algorithm. Shouldn’t position 1 get more clicks than position 2? What if that weren’t the case due to an author image?
Google wants to push personalized searches even more, and the inclusion of images in those searches actually detracted from this. People would click on personalized search results much too often compared to regular results. Google wants users to be “blind” to them, by making them visually more integrated.
Google is making big changes to Google+ and how it is integrated with other Google products. There are more big changes coming!

A. Dave Rekuc (Marketing Director)

Probably not. If it were a mobile-only difference, Google would only roll the change out to mobile devices; they’re smart enough not to treat their entire search audience as one unit. I think what’s happened is that a feature with good intentions wound up driving results that didn’t actually favor a better search experience, plain and simple. Mediocre articles with author mark-up caught the eye in search results, and good sites that were ignoring the mark-up got passed up.

I’m sure there are 1,001 conspiracy theories claiming that Google rolled out such strong authorship mark-up in their SERPs to lure contributors to Google+. Totally possible, completely unprovable. Whether it did or didn’t, I think it’s fair to assume that Google+ is here to stay and that ignoring authorship mark-up, even after losing the author’s image, is a fool’s errand. We know the web is getting more social and we know Google is paying attention now; it’s easy to implement, so I can’t see why an author should ignore it.

Q. Do you believe @JohnMu that this will not affect click-through? Why not?

A. David Faltz (Founder White Rabbit Marketing. Search Engine & Branding Optimization (SEBO) Marketer)

Absolutely not! Google is always trying to convince us they are not the big bad corporation, and that their interests are aligned with ours. Though I respect John Mueller, I do believe this is just PR. All kinds of testing done by 3rd parties has already confirmed that author images increase CTR. How could they not have?! It was a fantastic equalizer in terms of putting less emphasis on where you ranked on any particular SERP.

A. Swayam Das (Social Media Marketer)

I do not believe that CTRs won’t be affected. Primarily because if I place myself in the searcher’s position, I would definitely click on results that had images beside them. To my eyes they serve as a signal of being genuine, of someone that holds authority. For example, if I search for “diet pills” and amongst the 10 results I see a doctor’s pic beside a site, then I’ll definitely click on that, ignoring the others – a normal user won’t know which is an authority site.

A. Marc Nashaat (Enterprise Marketing Consultant)

Not particularly. Putting aside the case studies, common sense tells us that a result with an image is going to stand out more than a plain text result. When things stand out, they get more attention. Pretty simple. I’m also curious what these observations were based on; whether they were SERPs where all (or most) listings had authorship images. If so, it’s possible that you wouldn’t see significantly higher CTRs than on a SERP with all plain text listings.

It’s hard to come up with ulterior motives for Google on this front; maybe they’ve found that authorship detracts from ad clicks, but that’s entirely speculation.

A. Paul Shapiro (SEO Director at Catalyst)

The first thing I thought when I heard John Mueller say that the removal of author images in the SERP wouldn’t affect click-through rate was “Okay, that’s easy enough to test”. I doubted that Google would want to make a false claim about something that is so easily tested. Someone will release a study on this subject and we’ll know the truth soon enough.

A. Dave Rekuc (Marketing Director)

I don’t believe that even a little bit.  On a relatively clean search results page, you’re going to tell me that an author’s image doesn’t catch the eye?  In eye tracking studies, human faces come up all the time as one of the first places the eye goes.  We’re definitely going to see a drop in CTR on our articles.  Everyone is losing the article picture at the same time and that may soften the blow, but not every search result contained the mark-up and that’s where we lose our competitive advantage.

Q. Please share how you feel about the change. Will you still care to verify your content after it?

A. David Faltz (Founder White Rabbit Marketing. Search Engine & Branding Optimization (SEBO) Marketer)

Setting up authorship is really not complicated, and less so if you are working with WordPress. There are plenty of plugins that make it even easier to implement. I would imagine this will affect adoption and participation rates moving forward. I think for the most part author verification has been a failed experiment that has mostly been used by internet marketers. Google knows that and wants to take away yet another edge from us. G+ may be next! lol

A. Anna Fox (Blogger)

Google seems to be still showing pictures in personalized results, which means you need to seriously work on your G+ following!

The big news for personalized (logged in to your Google account) search is that author photos may still show for Google+ posts by people you have in your circles (h/t to +Joshua Berg). Every other authorship result now looks just like those in the logged-out search example.

A. Swayam Das (Social Media Marketer)

This move by Google kind of coincides with the recent Google+ update! Personally, I was wondering if this move directly signals a cancellation of Google Authorship in the near future. If that is so, then I won’t be verifying my content. Has Google just removed author pics from search results, or is it winding down the entire authorship program? It depends!

A. Marc Nashaat (Enterprise Marketing Consultant)

I don’t agree with the change, but I’ve learned to adapt to the whims of Google. I will definitely still be using authorship markup. If you believe in the future of the knowledge graph, there’s no reason not to. At the very least you’re creating structured data for your content, and that’s never a bad thing. 

A. Paul Shapiro (SEO Director at Catalyst)

I’m going to continue to apply authorship to all of my writing. It still gives me a sense of ownership (especially within search) beyond a simple byline. I also think there are advantages beyond the author image. People can click to see other things I’ve written right within the SERP. It affects personalized search results (probably more important than author images, honestly), and it opens a world of future benefits in semantic search and the possibility of agent rank, should it ever be used beyond in-depth articles (which are also a benefit).

My gut is telling me this isn’t the end of Google+, but rather one change of many to come in how Google interacts with Google+ and how the Google+ team functions as an organization. Interesting times are ahead of us.

A. Dave Rekuc (Marketing Director)

I honestly think it’s crazy to consider not verifying your content just because the short-term benefit of the author’s image has disappeared. Google has proven a commitment to making Google+ work and to making its search results more personalized. They’ve created a way to structure your contributions across the web and personally build an authority that transcends domains. I think any content creator would still be foolish to ignore authorship at this point.

Now, what’s your input?

The post Google removing author pictures from search: Your input? appeared first on SEO Chat.

Source: SEO Chat | Google removing author pictures from search: Your input? | 11 Jul 2014, 7:43 pm

Long-term SEO in Competitive Niches: How We Survived all Google Updates

[Disclaimer: Contributors' views are their own. They do not necessarily represent Devshed views.]

Barry Schwartz has listed the most competitive niches in SEO: gambling, mortgage, hosting, real estate, travel, etc. We have been doing grey/black-hat SEO in one of these niches for 7 years already. Our sites have been in the TOP 10 for “online casino/slots/blackjack/…” and still remain there for less competitive but high-ROI keywords. We started with black hat – still, we invested a lot into long-term SEO, as it was obvious that Google would keep improving its algorithms. Most of the sites where we applied a long-term strategy were hit by neither the Google Penguin nor the Google Panda updates.

The famous Moz Search Engine Ranking Factors survey investigated the weight of top SEO ranking factors in Google: 40% links, 31% content, 8% brand, 8% user/usage/query data, 7% social, 6% other. At the same time, in really competitive niches content and user/usage/query data are not an issue – you have already done everything possible by default, just because all of your competitors are doing the same. Thus, sites with good content are competing for influence by means of their backlink profiles.

If you have a “Dentist Eaton Colorado 7th Street” site, you may use natural link building: local business directories, interesting blog articles, sponsored links. And you can claim that paid links are wrong, just as Rand Fishkin does. Still, there are really competitive niches where it’s just impossible to get enough relevant natural links – casino is an example. All competitors use grey/black hat and you are forced to do the same. We have been monitoring casino SERPs for years – there are only a couple of sites (out of hundreds) that use natural link building. One remark though: they are all more than 10 years old.

How to get links in competitive niches

1. On-the-budget techniques

Options: web 2.0 links; bulk blog comments; forum profiles; WordPress theme footers; hacked sites; etc.

Pros: very cheap (a permanent link for $0.1-10); very fast (less than 1 month); easily outsourceable (a lot of freelancers/companies provide such services).

Cons: these have always been a major target of Google’s webspam team. If they still work, it’s just a bug which Google will fix very soon. Read the LinkResearchTools article on how WilliamHill was penalized.

Conclusion: Cheap techniques should not be used for linking directly to long-term projects.

2. Buy high-quality relevant / irrelevant links

Options: good guest posts; in-content page links (forget about footers, sidebars, sitewide links).

Pros: affordable (in the casino niche, 1 good link from a PR2+ page costs $150-500 per year); fast (1-6 months); outsourceable (if you agree to pay double the price, of course, as trustful mediators may be greedy); if done right, you can stay in the TOP 10 for a long time (we track SERPs and most top-ranked sites use paid links).

Cons: you don’t control the linking sites. Not agile: you want to change anchors because of Penguin 7.0, but the webmaster doesn’t reply to your e-mails. A lot of fraud: some middlemen pretend to be webmasters, take money from you at the yearly price, pay the webmaster the monthly price and disappear. You also need to monitor sites daily in order to keep a good neighbourhood (you don’t want to be posted close to a “cheap viagra” link or on a page with 30 outgoing links nearby), to catch source sites that get penalized, and to notice sites that go down for weeks because the webmaster doesn’t consider it important.

Conclusion: Often worth the costs, yet you don’t get any competitive advantage – competitors can see in Majestic where you buy ads and buy there too. Sometimes you just cannot find relevant links and are forced to buy irrelevant ones – they have less value and may dilute the site’s topic.

3. Build high quality relevant links

Options: own sites; own blogs.

Pros: competitive advantage; complete control; cheaper than bought links in the long-term perspective; additional ways to build links (link exchange); additional relevant traffic.

Cons: you need a proven way to make many high-quality, Panda-proof sites; you need to support the sites (add content, buy hostings); you need to make sure that nobody can connect your sites; you need to find ways to get many links to these sites.

Conclusion: If you don’t make your own sites yet, you should at least think about it. It’s very tempting – but you have to do it right.

What to choose?

Option                   | Price   | Speed      | Quality | Control/Agility | Risks
Buy cheap links          | low     | 1 month    | low     | low             | high
Buy relevant links       | high    | 1-3 months | high    | average         | average
Build own relevant links | average | 3-6 months | high    | high            | low

We recommend combining the 2nd and 3rd options:

Stop buying low quality links immediately.
Start or continue buying high quality relevant links, but choose partners carefully.
Make your own sites linking to your important sites to reduce risks; use them also for link exchange, reducing the budget for buying links.

Creating hundreds of sites: how to make it wrong

Our first sites used automatically synonymized content. Links from our 20 relevant sites promoted our important site to #3 in the “online blackjack” SERP for 6 months, and the same happened with other casino keywords. Unfortunately, those days are gone. You need to create readable content and think about security, because Google’s algorithms become more sophisticated every year.

Using WordPress or other widespread CMS is a bad idea

That’s the first thing that comes into an SEO’s mind. Many SEO gurus will tell you how to use WordPress for SEO. Still, if you want to make more than 10 sites – don’t invest your time and money into it.

If Google can detect that most of sites linking to you use the same CMS (like WordPress) – it’s not a natural pattern so it’s a good reason to penalize the site.

Here are some ideas on how Google can detect WordPress:

Inline text:
Powered by WordPress
<meta name=”generator” content=”WordPress 3.8.3″ />
<!– This site is optimized with the Yoast WordPress SEO plugin
<!– Performance optimized by W3 Total Cache

Source files in the same directories:
Images, CSS, JS in /wp-content
Links to /wp-includes

Existing URLs:
/wp-admin (shows a login page) and /wp-login.php
/xmlrpc.php (shows “XML-RPC server accepts POST requests only.”)

RSS feed format:
<generator>http://wordpress.org/?v=3.8.3</generator>
(that’s my favorite because everyone forgets about it; Google sees that all your linking sites have the same WordPress version and that all are updated the same day – not suspicious at all)

Considering the fact that 10% of all the sites are using WordPress, Google obviously has a WordPress detection algorithm in the ranking formula and updates it regularly. If 50% of your links are from WordPress sites you may be penalized soon.
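As an illustration of how mechanical such detection is, here is a sketch that probes a site for the fingerprints listed above; it only checks well-known default paths, and a hardened install may hide all of them.

```python
# Probe a site for common WordPress fingerprints.
import re
import requests

def wordpress_fingerprints(base):
    hits = []
    html = requests.get(base, timeout=10).text
    if re.search(r'<meta name="generator" content="WordPress', html):
        hits.append("meta generator tag")
    if "/wp-content/" in html:
        hits.append("assets under /wp-content")
    if requests.get(base + "/wp-login.php", timeout=10).status_code == 200:
        hits.append("/wp-login.php present")
    feed = requests.get(base + "/feed/", timeout=10).text
    match = re.search(r"wordpress\.org/\?v=([\d.]+)", feed)
    if match:
        hits.append(f"feed generator, version {match.group(1)}")
    return hits

print(wordpress_fingerprints("https://example.com"))
```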

The same goes for all other popular CMSs – Joomla, Drupal – and even frameworks like Symfony and CakePHP. The common rule is to use technologies that most webmasters use (PHP is preferable to Java), or those that are used by less than 1%. Google is smart enough to detect widespread technologies. It will notice that you use PHP (as most sites use it), so having all the sites linking to you built on PHP won’t be an issue. At the same time, WordPress is used by only 10% of webmasters; therefore, you don’t want Google to recognize that all the sites linking to you are built on WordPress.

It’s better if your CMS is not open-source: in that case, it is much harder for Google and for people to find connections between your sites.

Fingerprints in custom CMSs

The first thing you should remember: “NO FINGERPRINTS”. If there is something identical across all your sites, then Googlebot will find it; if not, your competitors will find it and send it to the Google team. Here are some ideas on what you can do wrong:

Tech stuff:
Same IP or C-class network (11.22.33.44 and 11.22.33.45)
Same NS-servers (IrishWonder has an article on how Interflora got penalized)
Same WHOIS
Same domain registrar

Nearby code:
Google Analytics (UA-1043770-1, UA-1043770-2, …)
Google AdSense

Same code:
Your own statistics code
Your banner management system
Same code in header/footer

Paranoid:
Logging in to Google services (Google Analytics, Google Webmaster Tools) from the same computer
Visiting many of your sites in Google-controlled software in one session: Google Chrome, or a browser with Google Toolbar
Finding many of your sites in Google in one session: “site:example.com”

It’s hard to be too paranoid in this matter. Check everything so there is nothing in common:

HTML code
Scripts code (your own and 3rd-party like Google Analytics)
Filenames
URL structure
Server headers
Outgoing links format
Same unusual robots.txt format
etc.

Find a good hacker and ask him to point out what the given sites have in common. Give him several of your sites and some of your competitors’, and set him the task of working out which ones belong to the same owner.
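Before hiring anyone, you can automate the first pass yourself. The sketch below fetches each site once and reports Google Analytics account prefixes, Server headers, and IPs that appear on more than one site; the domain list is a placeholder.

```python
# First-pass shared-fingerprint scan across your own sites.
import re
import socket
from collections import defaultdict

import requests

SITES = ["example.com", "example.org", "example.net"]  # placeholders
seen = defaultdict(set)

for host in SITES:
    resp = requests.get(f"http://{host}", timeout=10)
    # UA-1043770-1 and UA-1043770-2 share the account prefix UA-1043770.
    for ga_id in re.findall(r"UA-\d+-\d+", resp.text):
        seen[f"GA account {ga_id.rsplit('-', 1)[0]}"].add(host)
    seen[f"Server header {resp.headers.get('Server')}"].add(host)
    seen[f"IP {socket.gethostbyname(host)}"].add(host)

for fingerprint, hosts in seen.items():
    if len(hosts) > 1:
        print(f"SHARED: {fingerprint} -> {sorted(hosts)}")
```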

Hundreds of sites: what we did

Content & Design

Content should be cheap yet unique and readable. Make sure that checks for duplicated content are either part of your business process or automatically integrated into your CMS.

It’s a bad idea to get a free/paid WordPress template and use it. Google knows it doesn’t take much effort to create such a site. It’s easy for Google to replace all content blocks with “Lorem Ipsum…” and compare screenshots. Consequently – yes, looks also matter, and synonymizing <div> classes is not enough.

Many CMSs make creating design templates overcomplicated. Make sure that it takes no more than a day per site to create one – then you are practically guaranteed to get a unique design.

Support

Things you should do:

- Track all domain and hosting information (a sketch of automating the expiry check follows this list):
  - when domains and hostings expire;
  - which domain sits at which registrar and hosting, and under which identity (WHOIS);
  - the contact details, logins/passwords, and secret questions for each registrar and hosting;
  - what the IP is (track whether it changes, and don't buy hostings on nearby IPs).
- Check that your sites are live. 99% uptime means your sites will be down about 3 days a year; if you have 100 sites, then on average some site will be broken every day, and you need to fix it or move to another hosting as soon as possible.
- Track and check all external links. If you have 10-50 sites, you can still use Excel. Otherwise, find a more automated solution.
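A minimal sketch of the expiry check, assuming a hypothetical domains.csv inventory with domain, registrar, and expires columns:

import csv
from datetime import date, datetime, timedelta

# Hypothetical inventory file with one row per domain:
# domain,registrar,expires
# example.com,GoDaddy,2015-03-01
WARN_WITHIN = timedelta(days=30)

with open("domains.csv") as inventory:
    for row in csv.DictReader(inventory):
        expires = datetime.strptime(row["expires"], "%Y-%m-%d").date()
        if expires - date.today() <= WARN_WITHIN:
            print("Renew %s at %s (expires %s)"
                  % (row["domain"], row["registrar"], row["expires"]))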

We keep only 1-2 sites on any one hosting. How many sites you allocate per hosting is your own choice.

Also, don't register all your domains with the same registrar. It looks too suspicious if your site is linked only from domains registered at GoDaddy.

Link placement

For example, say you have 100 sites linking to your 5 important sites, and you decide to publish 3 links from each homepage. In total you will have 300 links, which means 60 links for each important site.

You should post links not only to your own sites but also to other trustworthy sites in your niche (even to your competitors) to make the profile look more natural. Let's say you decide to add 4 more links from each homepage and 9 from inner pages to other sites. That comes to 1300 links.

You can find relevant sites and ask them for a link exchange. This is how you get 1300 links from other sites, more than four times what your own sites alone provide, and it is less risky because it looks more natural.

Tip: always place a noticeable "Contact Us" link on the homepage so that people who want to exchange links can reach you.

Get good software to track links, because:

- you want to link only to live sites (no broken links);
- if a link exchange partner removes your link, you should know about it the same day.

Usually two programs suffice: a CRM and a link checker, though it would be nice to have them integrated.
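A minimal link-checker sketch, assuming the third-party requests library; the partner list is hypothetical. It flags dead pages, non-200 responses, and removed links:

import requests

# Hypothetical map of partner pages to the link each agreed to carry.
PARTNERS = {
    "http://partner-one.example/resources.html": "http://yoursite.example",
    "http://partner-two.example/links.html": "http://yoursite.example",
}

for page, your_link in PARTNERS.items():
    try:
        response = requests.get(page, timeout=10)
    except requests.RequestException as error:
        print("DOWN: %s (%s)" % (page, error))
        continue
    if response.status_code != 200:
        print("BROKEN (HTTP %s): %s" % (response.status_code, page))
    elif your_link not in response.text:
        print("LINK REMOVED: %s" % page)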

Budget

That really depends on your needs. We have several kinds of sites: simple ones (15 pages) costing around $300 per site, and more advanced ones (30 pages, better design and content) at $600 per site.

Therefore, our budget for creating 100 sites is $30,000 to $60,000. As calculated above, you can get 1600 links from those sites (300 links from the sites directly and 1300 via link exchange). That works out to $20-40 per permanent link. Assuming your sites live at least 3-5 years, you can spread the expense across several years, for an estimated $4-13 per link per year. On the one hand that is far cheaper than renting links from other sites ($150-500 per year), and on the other you can be completely sure of the quality.
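The arithmetic, spelled out with the article's own figures (the two budgets correspond to all-simple and all-advanced sites):

SITES = 100
own_links = SITES * 3        # 3 links from each homepage = 300
exchange_links = 1300        # from the link placement section above
total_links = own_links + exchange_links   # 1600

for budget in (30000, 60000):
    per_link = budget / float(total_links)
    print("Budget $%d -> $%.2f per permanent link" % (budget, per_link))
    for years in (3, 5):
        print("  spread over %d years: $%.2f per link per year"
              % (years, per_link / years))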

Of course you should add:

- support costs: domains, hostings, maintenance;
- linkbuilding costs for these sites (cheap methods can be used here).

Automate everything

The catch is that you need the process and software to fit within the budget described. It may take from 6 months to a couple of years if you decide to develop it yourself. Still, a safe future is more important, isn't it?

CMS features you may need:

- Backup. Hostings go down, and sometimes you lose all access. We always keep the latest content, and real-time data such as contact form submissions, subscriptions, polls, and visitor statistics are collected every 3 hours.
- Easy migration. If a hosting becomes slow or stops working altogether, you might want to move a site elsewhere. This should be very easy: transferring a site from one hosting to another should take minutes, not hours, and configuring a site at a new hosting should be simple.
- Checking site availability. Pingdom will cost you a fortune if you have hundreds of sites to check; still, downtime can eat up a part of your budget that exceeds the cost of Pingdom or similar services. We developed our own system because we needed extra information: which hosting is used now and which was used before, and how important each site is. We also needed to detect errors that Pingdom considers acceptable, such as visible PHP code or a missing </html> (a sketch of such a check follows this list).
- Easy learning curve:
  - HTML developers. Use a templating system that lets them copy another site's design and slightly modify it. If they spend less than a day per site and your sites don't share the same HTML, that's enough for a working model.
  - Copywriters. Make sure adding, modifying, and uploading a page takes seconds, not minutes. The process should also be simple: a copywriter shouldn't have to spend a month puzzling out your CMS.
- Automated error checks. There are a lot of typical mistakes, such as an unclosed tag, and it's not hard to check for them automatically.
- Content history. If a copywriter accidentally removes something important, that should not be a problem.
- Automatic randomization. Even outgoing links to affiliate partners should vary in format.
- Access control. Copywriters, HTML developers, and administrators should have different access levels.
- Multi-user editing. If 2 copywriters try to edit the same page at the same time, the CMS should prevent it, or at least warn them.
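A minimal sketch of such an availability check, assuming the third-party requests library and placeholder site URLs; it also catches the errors a plain uptime monitor would consider acceptable:

import requests

# Hypothetical list of sites; run this from cron every few minutes.
SITES = ["http://example.com", "http://example.net"]

def check(url):
    try:
        response = requests.get(url, timeout=10)
    except requests.RequestException as error:
        return "unreachable (%s)" % error
    if response.status_code != 200:
        return "HTTP %s" % response.status_code
    html = response.text
    # Errors that return HTTP 200 but still mean the site is broken:
    if "<?php" in html:
        return "visible PHP code"
    if "</html>" not in html.lower():
        return "truncated page (missing </html>)"
    return "OK"

for site in SITES:
    print("%s - %s" % (site, check(site)))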

Conclusion

It's tough to do natural linkbuilding in competitive niches, so you should make yours look as natural as possible. Now is a good time to stop using low-quality links and to raise the bar even for relevant ones. If you start building your own sites now, you will be prepared for the next Google updates and will hold a competitive advantage.

There are a lot of issues with public CMSs, so you may need to develop a custom solution. Developing a CMS and several hundred sites may cost $300,000-$500,000. Still, it will pay off even under current conditions, and if Google continues to tighten the screws, it may be the only way to survive.


The post Long-term SEO in Competitive Niches: How We Survived all Google Updates appeared first on SEO Chat.

Source: SEO Chat | Long-term SEO in Competitive Niches: How We Survived all Google Updates | 8 Jul 2014, 8:23 pm

The provider of affordable website hosting presents - 5 Useful Facts About StumbleUpon Traffic

StumbleUpon may have been around for a while, but marketers have been mixed in their advice about using it. As far as social media tools go, it is much different from most. Rather than enabling interaction, it is used as a discovery tool: you introduce people to your website through its randomizer, increasing the chances of users finding you based on how many pages you have to share.

***Take part in the thread: Is StumbleUpon any good for building traffic to my site?

They also have a paid advertising program. The StumbleUpon Paid Discovery service links people directly to your pages without any click-throughs from ads. It is supposed to remove the most difficult step, as so many users are jaded about following real advertisements thanks to an increase in shady pop-ups and sidebar ads.

How StumbleUpon Paid Discovery Works

Using the Paid Discovery tool is simple enough. You sign up and then pay a rate per Stumble, so you are only paying for the people who see your link. There is a base rate, and then you add options a la carte based on your targeting specifications.

For example, the base Stumble costs $0.10. Add-ons like location and age targeting cost between $0.02 and $0.06 each, upping the price of a Stumble to as much as $0.35 apiece.
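A quick sketch of how those add-ons compound; the targeting picks and campaign size below are hypothetical, chosen within the ranges quoted above:

BASE_RATE = 0.10                            # base price per Stumble
ADDONS = {"location": 0.06, "age": 0.04}    # hypothetical picks from the $0.02-$0.06 range

cost_per_stumble = BASE_RATE + sum(ADDONS.values())  # $0.20 here; up to $0.35 fully loaded
stumbles = 5000                                      # hypothetical campaign size
print("$%.2f per Stumble, $%.2f for %d Stumbles"
      % (cost_per_stumble, cost_per_stumble * stumbles, stumbles))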

What you are ultimately paying for is traffic that comes directly from StumbleUpon. But is it worth it? Here are five facts about StumbleUpon traffic that you can use to decide if it is the right tool for your campaign.

1. SU is still a social platform. Don't make the mistake of assuming that just because it is a traffic driver, StumbleUpon is just another marketing tool. It is still a social platform, and one that is increasing its user base by the month. People like a content-driven social network; it provides a unique formula that takes it out of the usual micro-blogging or connecting spheres (think Facebook, Twitter, Google+ and LinkedIn). It sits more comfortably alongside Pinterest and YouTube, as it is there to push traffic through content itself, not engagement. While you have to change your tactics for interacting, you still have to look at it through the same lens. For example, sharing your own content is fine, but it is not likely to get you a large following on its own. Instead, you have to engage socially by sharing third-party content relevant to the interests of your target demographic. This will ultimately increase the number of users who regularly return to your stumbles, and so to your site.

2. Mint gets 180,000 unique visits from SU alone. In probably its most enticing case study, Mint is a primary success story for StumbleUpon. The financial site itself stated that SU was the most effective and cost-efficient form of advertising it had used, including an unnamed social network (ahem, Facebook) it had used for PPC. That number isn't a total; it is per month. Mint managed to increase traffic on a consistent monthly basis that continues today, while expanding its user demographic to include the elusive 18-25 women category it had wanted to influence more strongly into using its product.

3. Only a percentage of traffic will be paid. Looking at the Mint example again, only 44% of the traffic came from Paid Discovery, and an additional 20% came from shared Paid Stumbles; the rest came from free campaigns. So when they said it was cost-efficient, that was obviously very much the case. SU's other case study, the Wisconsin Milk Board, saw an additional 60% traffic increase from Paid Stumbles. So while you use Paid Discovery to boost your traffic, there is evidence to suggest a fair amount of what you see will come from free Stumbles.

4. Good content provides increasing traffic over time. Nicholas Tart of Income Diary presented an interesting look into his own use of StumbleUpon. He submitted a single, high-quality piece of content that was "content StumbleUpon users like" and measured the results. On that single piece of content, he got an astonishing 158,000 Stumbles over time. Most of this started to happen five months after it was initially submitted, which teaches an important lesson: timing is different for SU campaigns. Where with other social networks you would hope to see a quick increase in shares, and possibly viral status once in a blue moon, SU is a more patient form of marketing. It has to be planted and allowed to grow. Be sure to check out Tart's article for some interesting advice on improving your results. There's also a very actionable article on getting traffic from StumbleUpon (supported by my own case study).

5. SU might be the best hidden treasure on the social web. Check out this post by Shareaholic. In the beginning of 2014, SU saw a 30% increase in referrals. In fact, it saw one of the highest increases of all the social networks, along with Facebook, Pinterest, and Google+. Sites like Twitter, YouTube, Reddit and LinkedIn saw a fall in referral rates. Granted, the numbers for SU could be because marketers started to really catch on to the platform's potential in the last two quarters of 2013, but that doesn't change the potential seen in those gains. StumbleUpon might be the best hidden treasure on the social web, and it is really worth a shot if you are failing to see the traffic or influence you hoped for on more saturated, less content-focused social networks.

Conclusion

Nowadays StumbleUpon may not be the most talked about social tool out there. But it is one of the most promising, and it is growing by the day. The statistics speak for themselves, and seeing the progress made by sites like Mint using it is nothing short of inspiring.

If you are looking for a traffic driver based more on content than on links, you might want to try it out. Since it is less focused on building through click-throughs, you can see how it might appeal more to the average social user. Plus, the competition is less fierce, thanks to its under-the-radar status.

Have you used StumbleUpon for marketing? How did it go? Let us know in the comments!

The post 5 Useful Facts About StumbleUpon Traffic appeared first on SEO Chat.

Source: SEO Chat | 5 Useful Facts About StumbleUpon Traffic | 30 Jun 2014, 3:39 pm

The provider of affordable website hosting presents - Google Payday Loan Algo Punishes Spammy Search Terms, Except On Youtube

Recently I put together an article about press release sites taking a huge hit in search rankings, presumably due to the "payday loan" algorithm, which is supposed to target heavily spammed keywords and sites using spammy techniques.

I spoke to an employee of a press release distribution company (both of whom will remain nameless), and they told me that the initial punishment occurred over the keyword "garcinia cambogia", a keyword that gets more than 800,000 searches per month according to Semrush.

As I continued to write, I decided to do a search for that keyword and see what the new results were. To my surprise, I found a short YouTube video ranking near the bottom of page 1. After doing some research on the video, I examined its backlink profile and came to the conclusion that it was ranking purely on the strength of spam.

This discovery got me thinking that perhaps YouTube, a Google-owned property, might be "protected" from such actions. After all, the more traffic its videos receive, the more revenue Google can generate through ads.

I decided to check some other keywords to see if my theory held true in another niche. After some consideration, I decided to focus on a local SEO keyword of the form "city name seo". I wanted a term with value and decent search volume.

The keyword I settled on gets roughly 500 searches per month for its "city name seo" form and could potentially generate a few hundred more visits by ranking for other variations of the same keyword.

Lo and behold, I was able to find a YouTube video ranking in the 6th position for this keyword.

Well, if it is ranking in the top 10 and Google is attacking spammy backlinks, then this must be a squeaky-clean, white-hat video, correct?

Think again!

The video has 65 views, yet it has 1700 backlinks from almost 300 domains. How does that happen? How can only 65 people have viewed the video, yet 1700 links have been created for it? Perhaps the links are high quality, so let's take a peek!

After checking the backlink profile on Majestic SEO, I found that most of the links come via blog comments. Wait a minute, blog comments can be white hat, right?

Of course they can, but when the anchor text is either an exact match or some variation of the main keyword, it screams spam. Don't take my word for it; take a look yourself!

Notice that this page has been spammed to death and has some unsavory keywords sitting alongside the "seo" keyword. I have redacted most of the information because I want to point out the facts without "outing" the video in question.

I think it is pretty clear that the site is simply using YouTube as a “host” to spam and rank.

In light of how Google has handled some news sites and the press release distribution sites, I find it rather interesting that it punishes these domains in the name of "search quality" while its very own property can be used to rank for some of these keywords with the shadiest of tactics, and no ill effects.

What are your thoughts?

The post Google Payday Loan Algo Punishes Spammy Search Terms, Except On Youtube appeared first on SEO Chat.

Source: SEO Chat | Google Payday Loan Algo Punishes Spammy Search Terms, Except On Youtube | 28 Jun 2014, 3:39 pm
