Security Flaw in Google Webmaster Tools Showing http://wmxstasy.sandbox.google.com/malware as Recent Site
I was just looking at my Google Webmaster Tools account to review all of my sites, but when I clicked on the Recent Sites dropdown menu, I only saw two sites listed, neither of which I have any affiliation with, and one of which piques my interest. The first site listed was oregonstate.edu, for some reason. The second site was http://wmxstasy.sandbox.google.com/malware. When I tried to view the data for either of those sites, I got a 503 error message, so at least the data was not accessible to me. However, I shouldn’t have access to ANY data for either one of those sites, so this is either a huge security flaw or a minor glitch; you pick. I am curious whether the wmxstasy malware Google link is actually associated with one of my websites that happened to get flagged for malware. I took a screenshot, which I’ve posted below; let me know what you think.
Big Data Analytics – what is it? Well, “big data” is a relatively new term which describes keeping track of huge datasets from all different traffic sources, including online, search, social media, mobile, traditional media and so on. Big Data analytics is an even newer term which describes the ability not only to track all of this data, but to make sense of it in order to make important business decisions. Big Data is the natural progression of business intelligence tools, CRM and marketing automation software suites.
According to Wikipedia on Big Data:
“Big data is a term applied to data sets whose size is beyond the ability of commonly used software tools to capture, manage, and process the data within a tolerable elapsed time. Big data sizes are a constantly moving target currently ranging from a few dozen terabytes to many petabytes of data in a single data set.”
“The practitioners of Big Data Analytics processes are generally hostile to shared storage. They prefer direct-attached storage (DAS) in its various forms from solid state disk (SSD) to high capacity SATA disk buried inside parallel processing nodes. The perception of shared storage architectures—SAN and NAS—is that they are relatively slow, complex, and above all, expensive. These qualities are not consistent with Big Data Analytics systems that thrive on system performance, commodity infrastructure, and low cost.”
“The impact of “Big Data” has increased the demand of information management specialists in that Oracle, IBM, Microsoft, and SAP have spent more than $15 billion on software firms only specializing in data management and analytics. This industry on its own is worth more than $100 billion and growing at almost 10% a year which is roughly twice as fast as the software business as a whole.”
As you can see, Big Data is going to continue to grow over the next 5-10 years, so the opportunity is ripe for software vendors to come up with big data analytics tools.
One of my clients’ sites was built in MODx version 1.0.2, or “Evolution,” by their former web development company. Today the RSS feed parser which displays the latest blog entries from a WordPress blog on the rest of the pages of the site stopped working. The current RSS feed parser is called “FeedParser” and I can’t see why it would stop working; the WordPress RSS feed is completely valid. I have tried to use other MODx RSS feed parsers such as getFeed, but it appears I have to be able to add the PHP code as a new “Snippet” in MODx. The problem is that when I log into the MODx dashboard, I don’t see an option to add a snippet or manage snippets. I can add “Chunks,” but those don’t let you add PHP code. So my assumption is that I will not be able to edit the previous snippet or add a brand new snippet until my user permissions/privileges are upgraded, which requires my client to contact their previous web development company to get them to upgrade it for us.
My question is – is there a way to add an RSS feed parser to a MODx site without the use of snippets? The getResources snippet only works in versions 2.0 and above, so that’s not an option. Otherwise, I am screwed until the client gets in touch with the other company, and who knows how long that will take. Meh.
I am watching the Heisman Trophy award presentation and I love the Superman socks with cape that Baylor quarterback Robert Griffin III showed to everyone on stage. It’s the commercial break now, but I have a feeling those Superman socks are going to propel RG3 to win the Heisman Trophy as the best college football player in the country within the next few minutes. I imagine these socks are going to be one of the top selling items for Christmas this year, especially if Robert Griffin III wins the Heisman as predicted. I did a search for “Superman socks with cape” and didn’t come across many search results, so I figured I would try my luck here to see if anybody happens to stumble across this blog post and can point me in the direction of where to buy these socks. I really want them for myself and for my nephews because they are going to be a big hit. I mean, c’mon, those socks are awesome by themselves but the cape is absolutely incredible!
I am trying to set up a dev server for one of my client’s sites, which is hosted in Joomla. I am using Akeeba Backup to automate the transfer process, but I keep getting the following error message: “Uploading report has failed because the file is unreadable”. I tried Googling that phrase to see if other people have experienced the same problem, but of course I can’t find anything related to Joomla.
Now I have no idea what I’m supposed to do. Is it related to file permissions on the live server or the new dev server? Ahhhhh… I’m clueless!
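If it does turn out to be a permissions issue, a quick sanity check is just to look at the mode bits and readability of the file Akeeba is complaining about. Here’s a minimal sketch in Python; the current directory is only a stand-in, since the real target would be wherever Akeeba writes its backup archive on the server:

```python
import os
import stat

def check_readable(path):
    """Return (mode_bits, readable) for a path -- useful for spotting
    permission problems like a 'file is unreadable' error."""
    mode = oct(stat.S_IMODE(os.stat(path).st_mode))
    return mode, os.access(path, os.R_OK)

# "." is a placeholder; point this at the actual backup archive
mode, readable = check_readable(".")
print(mode, readable)
```

If `readable` comes back False for the web server’s user, a chmod/chown on the archive (or its directory) would be the first thing to try.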
Want to know how to improve your website’s conversion rate but you have a painfully slow website? Easy: your first and only priority should be increasing the loading speed of your website before worrying about what color button to test this week. A slow website is the epitome of a website that won’t convert well at all. It’s pretty frustrating dealing with slow websites, especially a huge, complex, dynamic ecommerce website with tons of people working on it at the same time. The bigger these dynamic websites get, the more complicated and thrown-together the code becomes, which drags the page speed down even further. Don’t worry about SEO, Pay-Per-Click, email marketing, banner ads, Twitter campaigns, etc. until you at least get your website to load in less than 20 seconds! Ideally, your webpages should be loading in less than 2 seconds, tops!
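If you want a rough baseline number before you start optimizing, here’s a minimal sketch using only the Python standard library that times how long the raw HTML takes to download. The URL is a placeholder, and keep in mind a real browser load is slower, since it also fetches images, CSS and JavaScript:

```python
import time
import urllib.request

def average_load_time(url, runs=3):
    """Fetch a URL several times and return the average download time in
    seconds. This only measures the raw HTML transfer, so treat it as a
    floor on the real page load time."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as response:
            response.read()
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)

# e.g. average_load_time("http://www.example.com/")  (placeholder URL)
```

For the full picture (rendering, images, scripts), a browser-based tool is still the way to go; this just gives you a quick number to watch as you make changes.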
Everyone in the search engine optimization industry knows that linkbuilding is a critical part of any SEO campaign. And 99% of them realize that anchor text matters – which means that a text link with your targeted keywords as the anchor text will help your site rank better for that particular keyword phrase versus having a link that does not have your keywords as the anchor text. So a link with “dog treats” will help your dog treat website rank higher for that keyword than a link with simply your URL “http://www.asdfafsafsdfsadfsadfs.com/dogtreetis.html” will.
So everyone agrees on both of those points. However, not everyone agrees on whether or not varying the anchor text in your links is better for SEO. Meaning, if you had 100 links pointing to your website, is it better to have all 100 links use the anchor text “dog treats,” or is it better to have 40 links with “dog treats,” 20 with “Bob’s dog treats,” 20 with “treats for your dog” and 20 with “click here for delicious snacks for your puppy”?
One school of thought is that having exact anchor text for all of your links, or as many of them as possible, is better, and that it’s just a waste of time to worry about varying your anchor text. I have done a lot of research on different websites in random industries and analyzed their backlink profiles, and time and time again, the sites that tend to rank the best are the sites that have the most exact anchor text links pointing to them. This should not be surprising since search engines are still technically in their infancy; they’ve always used exact anchor text links as signals to determine what a site is about, and they obviously still consider them to be very important since these kinds of sites continue to rank very well.
However, the other school of thought is that varying your anchor text is critical, because if 1,000+ brand new links just pop up out of nowhere and they all have the exact same keyword phrase in the anchor text, then these links are most likely unnatural and were created simply to manipulate the search engine rankings for that keyword phrase, which the search engines definitely frown upon. Your site may get an instant boost to the top of the rankings if you have 1,000 brand new links with exact anchor text, but you will definitely drop eventually, whether due to an actual penalty on your site or simply the de-indexing of all of those suspicious backlinks.
So my verdict: exact anchor text links are still very important and should remain a top priority, but it’s also important to vary your anchor text for the best chance of staying power. The search engines are constantly evolving and getting smarter, so the more varied and “natural” your backlink profile is, the better your site will likely do in the long run. It’s not only important to vary your anchor text; it’s also important to vary the sources and neighborhoods of your backlinks – meaning it’s good to have links from blogs, press releases, articles, quality directories, social media such as Facebook and Twitter, reputable community sites, forums, and so on.
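To put some numbers on it, here’s a quick sketch that summarizes how varied an anchor text profile is. The backlink profile below is entirely made up, matching the hypothetical 100-link “dog treats” example above:

```python
from collections import Counter

# Hypothetical backlink profile: one entry per inbound link's anchor text
anchors = (["dog treats"] * 40 +
           ["Bob's dog treats"] * 20 +
           ["treats for your dog"] * 20 +
           ["click here for delicious snacks for your puppy"] * 20)

counts = Counter(anchors)
total = len(anchors)
for text, n in counts.most_common():
    # Show each anchor phrase's share of the overall profile
    print(f"{text}: {n} links ({n / total:.0%})")
```

Running this kind of breakdown on an exported backlink report makes it easy to see when one exact-match phrase dominates the profile.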
First Link Priority – Does Google Only Count the First Link?
One of my favorite things about search engine optimization, or SEO, is the constant experiments that must be done to try and figure out exactly how the algorithms of the search engines work. Google has stated publicly before that there are more than 200 factors in the algorithm that determines a website’s ranking, but they keep the actual algorithm secret. They also constantly test and make tweaks to their algorithms, so what works today may not work tomorrow. That’s why SEO professionals have to stay on top of the latest changes in order to not be left behind.
One of the most important factors that Google uses is the anchor text of text links pointing to a specific URL. Every link is counted as a “vote” for that website, and if a certain webpage has a ton of links pointing to it with the same specific keywords in the anchor text, Google assumes that “huh, all of these other websites are linking to this page for ‘keyword whatever,’ so that must mean it’s relevant for that keyword phrase and thus should be ranked higher in the search engine rankings.” This seems somewhat obvious nowadays, but this shift to focusing on links from other sites rather than actual keywords appearing on a page is how Google separated itself from the other primitive and spam-ridden search engines in the early years. Anchor text in links is still extremely, extremely important, but the search engines are always looking for other signals to consider, since putting too much importance on just anchor text in links is an easy way for webmasters to manipulate their search engine rankings.
Even though links from external sites are more important than internal links, internal linking is still a very important piece of any SEO strategy. What I mean by internal linking is simply using optimized text in your text links that point to other pages on your same site. So instead of using “click here” to visit a page about electric guitars, that link text should say “electric guitars” instead. This helps the search engines determine what that page the link is pointing to is about and “electric guitars” obviously makes a lot more sense than “click here” does.
But what happens if you have more than one link pointing to the same URL on a webpage? For example, say you wrote a blog post about guitars and the first link you used to link to a specific URL was “guitars,” then in the next paragraph you linked to that same URL with “electric guitars,” and then in the last paragraph you used “click here to view our wide selection of guitars and guitar accessories.” You would think that because you had 3 links in your blog post, you would be giving 3 “votes” to that URL to rank for all three of those keyword phrases. However, this is apparently not the case, at least not in Google. There have been plenty of experiments done by other SEO professionals, and most of them have come to the same conclusion – Google only counts the first link that appears on that webpage, and more specifically, the first link that appears in the source code, which means Google completely ignores the other two links. This has come to be known as First Link Priority.
Now you might not think this is a big deal, but it definitely is, especially in ultra competitive industries. Oftentimes a seemingly small factor like this is what separates 1st place and 2nd place, when all else is equal. And in these ultra competitive and lucrative industries, ranking #1 instead of #2 or lower can mean millions of dollars in revenue every year.
One of the most commonly neglected areas of a website is its navigation. Most of the time the navigation shows up at the top of a website or on the left-hand side, and because this is the case, most of the time the navigation shows up first in the HTML source code of a webpage. You might not think twice about linking to your homepage with the text “Home” instead of “Electric Guitars Online,” but now that you know that Google only counts the first link, you’ll probably want to reconsider how you link to your internal pages. You can have the sexiest webpage design and the best content on the web, but if the search engines are only counting the first time you link to your internal pages, then your SEO results may be limited.
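If you want to see which link would actually “win” under this hypothesis, here’s a rough sketch using Python’s built-in HTML parser that reports the anchor text of the first link to a given URL in source order. The sample HTML is made up for illustration:

```python
from html.parser import HTMLParser

class FirstLinkFinder(HTMLParser):
    """Record the anchor text of the first link to a target URL,
    in source-code order (the only one counted, per First Link Priority)."""
    def __init__(self, target_url):
        super().__init__()
        self.target_url = target_url
        self.first_anchor_text = None
        self._capturing = False
        self._buffer = []

    def handle_starttag(self, tag, attrs):
        if tag == "a" and self.first_anchor_text is None:
            if dict(attrs).get("href", "") == self.target_url:
                self._capturing = True

    def handle_data(self, data):
        if self._capturing:
            self._buffer.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._capturing:
            self.first_anchor_text = "".join(self._buffer).strip()
            self._capturing = False

# Made-up page with three links to the same URL, like the blog post example
html = """
<a href="/guitars.html">guitars</a>
<p>Some content in between...</p>
<a href="/guitars.html">electric guitars</a>
<a href="/guitars.html">click here</a>
"""

finder = FirstLinkFinder("/guitars.html")
finder.feed(html)
print(finder.first_anchor_text)  # prints "guitars"
```

Run against your own templates, this kind of check will tell you whether your navigation’s “Home” link is eating the anchor text vote before your keyword-rich body links ever get seen.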
Don’t believe me? Fine, here are some links to some of the experiments that other people have done to test out the First Link Priority hypothesis:
and a nifty tool to check your links for First Link Priority – http://www.firstlinkchecker.com/
WordPress vs Joomla vs Drupal – Which Open Source CMS is the Best?
Most of the websites I work on nowadays are built on content management systems (CMS) so the clients themselves can upload monthly calendars or change photos without having to contact and pay us to do it for them. And since most of our clients are local businesses with limited budgets, they don’t want to splurge on enterprise-level content management systems, so 95% of the time we’ll use an open source CMS since it is free (I know, I know – most have some kind of limitations, but otherwise they don’t cost any money to use, so that’s why I said they’re free). Here are the top three open source content management systems that we have used.
1. WordPress – My personal favorite and I always opt for it if I’m given a say in the decision. I mean, hey, this blog is on WordPress along with my company’s site and scores of our clients’ sites. Yes, I know it’s natively a blogging platform but it’s come a long way the past few years so that we are able to customize it into a fully functional CMS. It’s easy to install and easy to use, especially for clients who are not very computer savvy. It’s fast (with the correct caching plugins) and lightweight compared to the behemoths of Joomla and Drupal. And perhaps the best thing about it is that it has a huge community of volunteers who have created very powerful plugins and most of the time you can use them for free.
WordPress Pros – Everything
WordPress Cons – I would normally say “nothing” but I guess there is one tiny downside of WordPress. If using the default settings, most WordPress installations will crash if they experience a surge in traffic. However, most websites will never get these kinds of huge surges in traffic so it’s normally not a problem and if there is the possibility that your site could get a surge in traffic, there are tons of freely available instructions for how to configure WordPress to handle an increase in traffic along with tons of free plug-ins.
2. Joomla – I’ve used Joomla on a handful of occasions, most of the time because a new client was already using Joomla as their CMS. Why anyone would ever choose Joomla over WordPress is beyond me, especially nowadays. Joomla is such a pain in the neck to use, even for an experienced computer user. The entire user interface and admin panel architecture is just a clusterfuck of “modules”, “components”, “extensions”, “sections”, “articles”, “pages” and so on. It’s certainly not intuitive to use AT ALL. And of course they have ridiculous account access levels. You would think “Administrator” would be given access to everything within the CMS, but after an hour of trying to figure out how the hell to change a meta tag on one page, you find out that you need “Super Administrator” access to make such a minor change. Of course, the Administrator level still gives you enough access to fuck up and delete the entire site, but you need God-mode Super Administrator privileges to tweak some keywords.
Joomla Pros – Nothing that I’ve found thus far
Joomla Cons – You name it – ugly, hard to use admin interface, non-intuitive, way too many pieces scattered everywhere, stupid account access levels
3. Drupal – A couple of years ago we used Drupal for a rather large local company on a big web design project. Drupal is a lot like Joomla in that it’s bloated software, not intuitive to use and even harder to customize. I would rank it above Joomla, though, because it is a lot more robust for large, complicated sites, but it’s still a headache to work with.
Drupal Pros – Only thing is that it’s better than Joomla
Drupal Cons – Everything else
So the moral of the story is that WordPress is by far the best open source content management system / blogging software available, so you should use it now!
Most of the websites I work on are on Linux/Apache servers, so I can easily add 301 redirects via the .htaccess file. However, my new client is on an IIS server that I don’t have access to, so I can only add redirects via the httpd.ini file (ISAPI_Rewrite) and then send it back to the client to upload for me. I figured out how to add 301 redirects in the httpd.ini file at the page level, but I am having the hardest time trying to find a good resource on how to redirect the non-www version of the website to the www version. The following line redirects at the page level:
RewriteRule /oldpage.asp http://www.site.com/newpage.asp [I,O,RP,L]
But that obviously won’t work for redirecting the non-www version of the site to the www version. It’s not even an issue of duplicate content here; the non-www version simply shows an “Under Construction” message while the www version works just fine, so it’s even more critical that the non-www version be redirected. However, my searches on Google have been fruitless so far. Most of the results are web forum threads where people ask the same question I’m asking, and then of course the question is never answered, or the original poster figures it out on their own and never posts the answer in the thread, both of which are worthless to somebody looking for that answer.
I came across one forum which provided the solution below, but I’m not 100% sure it will work as expected…
RewriteCond %HTTPS off
RewriteCond Host: (?!www\.)([^.]+\.[^.]+)
RewriteRule (.*) http\://www.$1$2 [I,RP]
And then another blog post had the following:
# Move anything from non- www.example.com -> www.example.com
# e.g. example.com -> www.example.com
RewriteCond %HTTPS off
RewriteCond Host: (?!^www.example.com)(.+)
RewriteRule /(.*) http\://www.example.com/$2 [I,RP]
Has anybody had success redirecting non-www versions of pages to www versions on IIS using one of the two methods above?
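For reference, the pattern I’ve seen recommended most often for non-www to www redirects in ISAPI_Rewrite 2 is a simpler host match than either of the above. This is untested on my end (and site.com is a placeholder for the actual domain), so treat it as a sketch rather than a confirmed fix:

```
# Untested sketch: 301 redirect site.com to www.site.com (ISAPI_Rewrite 2 syntax)
# I = case-insensitive, RP = permanent (301) redirect
RewriteCond Host: ^site\.com
RewriteRule (.*) http\://www.site.com$1 [I,RP]
```

If anyone has verified one of these variations on a live IIS box, I’d love to hear which one actually worked.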