Google often tests new features, UI improvements, and other ideas by temporarily making them available on the homepage. They track the analytics, which tells them how people respond. When they do this, it's never shown to everyone using Google, just a select group, region, or sometimes a whole country (they once ran a test only in Britain, for example). You might have gotten a test page that no one else saw because it was one of their tests.
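For anyone curious how a "select group" gets picked, the usual approach is deterministic bucketing: hash a stable user identifier and assign the experiment to a fixed slice of the hash space. Below is a minimal sketch of that general technique, not Google's actual system; the experiment name, percentage, and country check are all made-up values for illustration.

```python
import hashlib

def in_experiment(user_id: str, experiment: str, percent: float) -> bool:
    """Deterministically place `percent` of users into an experiment.

    The same user always lands in the same bucket, so the test group
    stays stable across visits without storing any per-user state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # uniform value in [0, 1]
    return bucket < percent / 100.0

# Hypothetical example: show a new homepage UI to 5% of UK users.
def show_new_homepage(user_id: str, country: str) -> bool:
    return country == "GB" and in_experiment(user_id, "homepage-v2", 5.0)

print(show_new_homepage("user-12345", "GB"))
```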
fhdendy
Posts
-
Girlfriend Married
I dunno, maybe I am just evil, but I would call her new husband and ask him to tell her I'm done with her, spice up the honeymoon and all that...
-
SEO / SEM apps
Viral is explosive, always explosive, and the links turn up far more often in personal online media (social "wall" posts, emails, social bookmarking sites) than a traditional site's links ever would on those kinds of media. It makes sense if you think about it: viral means people bringing in other people, and those channels are today's best way to let others know you found something cool.

The difference between white and black hat can come down to a ton of factors. It sounds like you want to know the thresholds where one type of activity turns into the other, but because of ALL the different factors the engines take into consideration, there is no defined break. The use, from my point of view, would be to have another tool in my toolbox working for me, and I would rather it not test the boundaries but always stay in the safe zone; that way I could sleep at night knowing it's not going to get me blacklisted.

As for how fast: again, a lot of factors come into play, but if you're going after even decent keywords (and by decent I mean ones people actually use to search) then you need to assess your own site and efforts against what the top SERP results are doing in your respective area(s).
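To make the "viral links live in social media" point concrete: given a list of referring pages for a URL, you could roughly gauge how social-heavy its link profile is. This is a toy sketch under obvious assumptions; the domain list and the sample URLs are invented, and real classification would need far more signal than domain matching.

```python
from urllib.parse import urlparse

# Hypothetical domain buckets; a real list would be much larger.
SOCIAL_DOMAINS = {"twitter.com", "facebook.com", "delicious.com", "digg.com"}

def social_share(referrers: list[str]) -> float:
    """Fraction of referring pages that come from social/bookmark sites."""
    if not referrers:
        return 0.0
    social = sum(
        1 for url in referrers
        if urlparse(url).netloc.removeprefix("www.") in SOCIAL_DOMAINS
    )
    return social / len(referrers)

refs = [
    "http://twitter.com/somebody/status/1",
    "http://digg.com/story/cool-thing",
    "http://example-blog.com/review",
]
print(f"social share: {social_share(refs):.0%}")  # high share looks viral
```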
-
SEO / SEM apps
If it were me, I would write the application as a service and make it act like a real link builder would. Users could load in multiple pieces of content to be submitted, including multiple versions (different descriptions, titles, names, etc.), and the service would submit only one or two of those versions at a random but bounded pace, say 4-7 a week, with a total of no more than 10-20 submissions per piece of content. As a service you could also randomize which sites it hits and continually add and remove sites from the site list; so let's say one of those sites gets blacklisted and every link coming out of it picks up a little bad influence, you can make sure no future submissions go there. It would be basic white hat link building, automated, and that is a service I would pay for.
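Here is a rough sketch of the pacing logic described above: random but bounded weekly submissions, rotating content versions, and a site list you can prune when a directory goes bad. All names and numbers are illustrative assumptions, not a real service.

```python
import random
from dataclasses import dataclass, field

@dataclass
class Campaign:
    versions: list[str]          # alternate titles/descriptions for one piece
    sites: set[str]              # directories still considered safe
    max_total: int = 15          # hard cap per content piece (10-20 range)
    submitted: int = 0
    blacklist: set[str] = field(default_factory=set)

    def drop_site(self, site: str) -> None:
        """Stop submitting to a site that got blacklisted."""
        self.sites.discard(site)
        self.blacklist.add(site)

    def plan_week(self) -> list[tuple[str, str]]:
        """Pick 4-7 (site, version) submissions for this week, never
        repeating a site and never exceeding the campaign's total cap."""
        remaining = self.max_total - self.submitted
        count = min(random.randint(4, 7), remaining, len(self.sites))
        chosen = random.sample(sorted(self.sites), count)
        plan = [(site, random.choice(self.versions)) for site in chosen]
        self.submitted += len(plan)
        return plan

camp = Campaign(
    versions=["Widget guide v1", "Widget guide v2"],
    sites={"dir-a.example", "dir-b.example", "dir-c.example",
           "dir-d.example", "dir-e.example", "dir-f.example"},
)
print(camp.plan_week())
```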
-
Website design problem
Yes, this is easily done. If the site is dynamic, make them includes; if it's not and is basic HTML, use the Dreamweaver library tool and let it do all the work for you.
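If the site is static and you would rather not depend on Dreamweaver, the same "edit once, update everywhere" effect can be scripted. A minimal Python sketch that stamps a shared header file into every page at build time; the file names and the marker comment are assumptions for illustration.

```python
from pathlib import Path

HEADER = Path("includes/header.html").read_text()
out_dir = Path("build")
out_dir.mkdir(exist_ok=True)

# Replace a marker comment in each source page with the shared header,
# so editing header.html updates every page on the next build.
for page in Path("site").glob("*.html"):
    html = page.read_text()
    (out_dir / page.name).write_text(html.replace("<!-- HEADER -->", HEADER))
```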
-
SEO vs. GRAPHIC DESIGNS?
Yes, insofar as the search engines look at things like file size, page load times, etc. But a large number of images also gives you several SEO advantages, like image file names, alt tags, and so on. The biggest things to pay attention to: don't put your keywords inside your images, keep them as text or you're wasting your time; always optimize your images for the web; and lay the page out in this order of priority: usability, SEO layout, artistic design (which is different from good graphic design). By SEO layout I mean the search engines effectively only read the first 350-550 words on a page, so don't create a layout where a bunch of junk comes before your real content. Hope these basics help.
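The "first 350-550 words" and alt-tag points are easy to sanity-check yourself. Here is a rough audit sketch using Python's standard HTMLParser; the 450-word cutoff, the keyword, and the file name are assumptions mirroring the rules of thumb above, not an official search-engine spec.

```python
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    """Collect visible words and <img> attributes from a page."""
    def __init__(self):
        super().__init__()
        self.words: list[str] = []
        self.images: list[dict] = []
        self._skip = 0  # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "img":
            self.images.append(dict(attrs))

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = max(0, self._skip - 1)

    def handle_data(self, data):
        if not self._skip:
            self.words.extend(data.split())

audit = PageAudit()
audit.feed(open("page.html").read())

keyword = "widgets"  # hypothetical target phrase
pos = next((i for i, w in enumerate(audit.words) if keyword in w.lower()), None)
if pos is None or pos > 450:
    print(f"'{keyword}' not in the first ~450 words; content may sit too deep")
for img in audit.images:
    if not img.get("alt"):
        print("missing alt text:", img.get("src"))
```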
-
SEO / SEM apps
There is indeed some value to using this technique, and if you could automate it in some way then average people, not just SEO professionals, would probably flock to the product, so yes, you could potentially make something cool. The downsides:

Most of these types of sites use nofollow tags on their submissions. How much weight the search engines give a nofollow link changes over time, but it is still not counted as important as a normal link.

Automated implies fast and efficient, which in the search engine world can actually mean penalties for your sites. If a spider picks up that you just got a mass of inbound links, it will assume you are trying to game the system (which you are), because normal link building is done slower and more spaced out. The only fast growth that gets credit is viral, and they can easily tell what's viral from what's generated, because viral growth is even more explosive and shows up more in certain types of media.

Also, if it's automated you are probably talking about one to a few copies of the same content on multiple sites. That is duplicate content on sites that accept submissions, which raises two issues: the engines treat outbound links from these types of sites differently (otherwise ranking would be as easy as posting the most content to hit the top SERPs), and the duplicate content itself hurts the link juice being passed.

There are some cool things you could do in your programming to get around these and all the other issues, though. If you did that, it would be a very handy piece of software to have.
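On the nofollow point: it's simple to verify whether a submission site actually passes link equity before bothering with it. A small sketch with Python's standard HTMLParser; the URL is a placeholder for a page where your submitted link landed.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class NofollowCheck(HTMLParser):
    """Count links carrying rel="nofollow" versus links that do not."""
    def __init__(self):
        super().__init__()
        self.followed = 0
        self.nofollowed = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            rel = dict(attrs).get("rel") or ""
            if "nofollow" in rel.lower():
                self.nofollowed += 1
            else:
                self.followed += 1

# Placeholder URL: point this at the page hosting your submission.
html = urlopen("http://example.com/submitted-article").read().decode("utf-8", "ignore")
check = NofollowCheck()
check.feed(html)
print(f"followed: {check.followed}, nofollow: {check.nofollowed}")
```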
OMPundit