
Wednesday, June 15, 2011

Google Launches New Tools For Display Network Advertisers


Google AdWords Display Network New Features

Google has announced new tools for the Google Display Network, which it says will give users greater transparency and more value from their campaigns.

One feature is relative CTR, which shows how effective your ads are compared to other ads on the same Display Network sites. "The behavior of users on pages varies depending on what the page is about," said Dan Friedman of Google's Inside AdWords crew. "For example, users may interact with ads on a product review page differently than with ads in other places. Clickthrough rate (CTR) tells you how often users click on your ads, but CTR can't tell you how effective your ad is compared with other ads on the same page."
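A rough illustration of how the metric works, using made-up numbers: if your ad's CTR on a group of placements is 1.5% and the average CTR of the other ads shown on those same placements is 0.5%, your relative CTR is 1.5 / 0.5 = 3.0, meaning your ad is clicked three times as often as its neighbors.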

The feature can be accessed by clicking the "Ad Groups" tab in your AdWords account, clicking "Customize Columns", and choosing "Relative CTR" from the drop-down menu.

Another feature is impression share, meant to help advertisers measure their online presence. It represents the percentage of times your ads were shown out of the total number of impressions they were eligible for. "In other words, the percentage of available impressions that make up your share of voice online," said Friedman. The Lost IS (Budget) metric tells you the number of impressions you lost because of your budget, and Lost IS (Rank) the number of impressions lost because of your Ad Rank.
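A quick worked example with made-up numbers: if your ads were eligible for 1,000 impressions and were actually shown 600 times, your impression share is 600 / 1,000 = 60%. If 300 of the missing impressions were lost because your budget ran out and the other 100 because of Ad Rank, Lost IS (Budget) would read 30% and Lost IS (Rank) 10%.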

There is also a contextual ads diagnostic tool (CADT), which explains why ads are not appearing on Display Network placements. Google will begin letting a small number of advertisers use it this month, then fully launch it in July.

Another new feature is the impression filter, which ensures that advertisers do not pay CPM rates for impressions that users had little chance of seeing.

Each of the new features is available in all AdWords languages. Learn more.

Tuesday, June 14, 2011

Google Analytics SEO Reports


Google announced a limited pilot of SEO reports in Google Analytics, based on the search query data in Webmaster Tools.

"Webmasters have long called for better integration into Google Webmaster Tools, Google Analytics, Google Webmaster Central Blog says.

SEO reports also take advantage of Google Analytics' visualization and filtering for further analysis, Google said. For example, you can filter for queries that had more than 100 clicks and see a graph of how much each of those queries contributed to the clicks from all your most popular searches. The search query data includes:

Queries: The total number of search queries that returned pages from your site over the given period. (These numbers are rounded and may not be exact.)

Query: A list of the search queries that returned pages from your site.

Impressions: The number of times pages from your site appeared in search results, and the percentage increase or decrease in average daily impressions compared to the previous period. (The number of days in each period defaults to 30, but you can change it at any time.)

Clicks: The number of times your site was clicked in search results for a particular query, and the percentage increase or decrease in average daily clicks compared to the previous period.

CTR: The percentage of impressions that resulted in a click through to your site, and the increase or decrease in average daily CTR compared to the previous period. (A quick worked example follows this list.)

Avg. position: The average position of your site on the results page for that query, and the change compared to the previous period. Green indicates that your site's average position is improving. The calculation of average position takes into account the ranking of your site for a particular query (for example, if a query returns your site as the #1 and #2 result, the average position is 1.5).
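As the worked example promised above, with made-up numbers: if a query produced 2,000 impressions over the period and your pages were clicked 50 times for that query, its CTR is 50 / 2,000 = 2.5%. If the previous period's average daily CTR was 2.0%, the report would show that as an increase.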

Webmasters can use the search query data to check which keyword queries their pages appear for and to compare impressions and clickthrough rates. It can also be useful for getting keyword ideas for paid search campaigns.

"We hope this will be the first of many ways to the surface Webmaster Tools Google Analytics data to give a fuller picture of the performance of your site," said Trevor Claiborne of Google Analytics team. "We look forward to working with members of the pilot project to help identify the best ways to get there."

Search Engine Patents and Panda


Bill Slawski is the president and founder of SEO by the Sea, and has been involved in professional SEO consulting and Internet marketing since 1996. With a degree in English from the University of Delaware and a Juris Doctor from the Widener University School of Law, Bill worked for the highest-level trial court in Delaware for 14 years as a court manager and administrator, and as a trainer and management analyst. While working for the court, Bill also began building and promoting websites, and took up SEO full time in 2005. He has worked on a wide range of sites, from Fortune 500 pages to small businesses, and writes about search engine patents and white papers on his seobythesea.com blog.

What are the signals that can be used by Panda?

Eric Enge: Let's talk about some of the patents that may play a role in Panda 1, 2, 3, 4, 5, 6, 7, and beyond. I'd love to hear your ideas about what signals might be used to measure either content quality or user engagement.

Bill Slawski: I looked at sites affected by Panda. I started with basic remedial SEO: I went through the sites, crawled through them, looked for duplicate content problems within the same domain, then for things that were not being indexed, and went through the list of issues Google Webmaster Tools provided for each domain.

In a Wired interview about the update, Amit Singhal and Matt Cutts said it was named after an engineer called Panda. I found his name in a list of Googlers and read through his material. There were a few other Googlers named Panda, but this one was an engineer who writes about architecture and information retrieval. I concluded that the Panda in question was someone who worked on the PLANET paper (more on this later).

For signals related to quality, we can look at the list of questions Google published. For example, does your site read like a magazine? Would people trust it with their credit card? There are many things on a website that could indicate quality and make a page seem more credible and trustworthy to a search engine, leading it to believe the page was written by someone with real expertise.

The way things are presented on pages, for example how many ad blocks you have, may be signals. If we look at the PLANET paper, "PLANET: Massively Parallel Learning of Tree Ensembles with MapReduce," its focus is not so much on quality signals or feedback from users, but rather on how Google can take machine learning with decision trees and scale it up to run on many computers simultaneously. You can hold a lot of pages in memory and compare one page against another, checking whether certain characteristics and signals show up on those pages.

Eric Enge: So the PLANET paper describes how to take a machine learning process that previously ran on one computer and put it in a distributed environment with much more power. Is that a fair assessment?

Bill Slawski: That would be a fair assessment. It uses the Google File System and Google MapReduce. It draws a lot of data into memory to compare pages against each other while changing variables simultaneously, for example with a regression-model type of approach.

Something that would have been very difficult with a very large data set becomes much easier when it can be scaled. It is important to think about what, on your web pages, signals quality.

Their approach is to manually identify which pages have high quality in content, presentation, and so on, and to use those as a seed set for a machine learning process that identifies other pages and classifies them by those same characteristics. That makes it harder for us to determine which signals, specifically, the search engines are looking for.

If they followed up Panda with PLANET-type machine learning, there may be other things thrown in; it's hard to say. Google may not have used this approach exclusively. They may have tightened up phrase-based indexing in a way that helps rank, and re-rank, search results.

Panda may be a filter under which some websites are promoted and others demoted, based on some kind of quality-signal score.

It seems that Panda is a re-ranking approach. It is not a substitute for relevance and PageRank and the two hundred other signals we are accustomed to hearing about from Google. It may be a filter on top of those, under which certain websites are promoted and others demoted based on some kind of quality-signal score.

Eric Enge: That's my feeling too. Google uses the term classifier, so you can imagine it running either before or after the basic algorithm, like a scaling factor that moves things up or down.

Bill Slawski: Right. This is what you hear.

Google On What Content You Should Have On Your Home Page


The latest Google Webmaster Central video has Matt Cutts discussing home page content. Given how big a deal Google has made of issues like content depth and site speed in recent history, it is useful to consider home page content with those things in mind.

The question Matt addresses is: "More or less content on the home page?" "You can have too much," said Cutts. "So I wouldn't have a home page that's 20MB. You know, that takes a long time to download, and users who are on dial-up or a slow modem connection are going to get angry at you."

"But usually, if you have more content on the homepage, there is more text Googlebot to find, as only images, for example, if you have images and captions - a bit 'of textual information can really do much," continues.

"If you look at my blog, I've had anywhere from May to October messages on my home page at any time, so I tend to turn to a little more content when it is possible, he added.

You can check out Matt's blog here if you want a better idea of how he does it.

Bing Webmaster Tools Refreshed


Bing has released some improvements to Bing Webmaster Tools in an update called "Honey Badger."

"Realignment of today offers a simplified experience webmasters that allows them to quickly analyze and identify trends - while adding new features and unique to the industry," said a representative of Bing WebProNews. "Our goal is to help webmaster make quicker and better decisions and drive new prospects to their site by presenting them with rich graphics and better organized, relevant content."

Improvements include:

Crawl Management: allows webmasters to configure bingbot's crawling for a specific domain.

Index Explorer: allows webmasters to access Bing's index data for a specified domain.

User and Role Management: allows website owners to give admin, read/write, or read-only access to other users of their site.

The crawl rate is configurable by the hour: users can ask Bing to crawl more slowly during their peak business hours and faster during off-peak hours. Drag-and-drop functionality lets users create the mapping by clicking and dragging the mouse pointer over the graph, and columns can be clicked to refine the data.

For more info: http://drshns.blogspot.com/

Monday, June 13, 2011

New Google Panda Update Approved


Google Panda Update

Google's Matt Cutts spoke in a Q&A session with Danny Sullivan at SMX Advanced this week and discussed the Panda update, among other things.

Many have criticized Google for returning results in which scraped versions of content outrank the original. Cutts said in a live-blogged session, "A team [is] working on this issue. A change has been approved that should help with this problem. Panda is constantly being reviewed. The change came from the search quality algorithms side, not the web spam team."

He says there is another change coming, and it remains unclear when Panda will roll out fully internationally (in other languages). He also says they have not made manual exceptions with Panda.

You may recall that the Cult of Mac blog was hit by the original Panda update, then, after some dialogue, ended up recovering in a subsequent change from Google. Matt says, however, "We did not make any manual exception. Cult of Mac might have been confused because they started to get all this new traffic from blogging about it, but we did not do anything with exceptions."

Yesterday we discussed some survey results from Search Engine Roundtable, which found that 4% of sites said they had fully recovered from the Panda update. Other sites have been finding partial recovery. On sites recovering from the update, Matt is quoted as saying, "The general rule is to push things out and find additional signals to help differentiate on the spectrum. We did not push anything that directly pulls things back. We recalculated data that affected some sites. There is a change coming that could affect sites and make a difference."

You may also recall Google's list of questions that webmasters can use to assess the quality of their content. Cutts spoke briefly about these questions, saying, "It can help as we recalculate."

He also said that what is being called "Panda 2.2" has been approved but has not yet been implemented. "If we think you're fairly high quality, Panda will have a smaller impact. If you have sufficient expertise and nobody else has good content, even if you've been hit by Panda, your site may not be down forever."

That says a lot about original content.

For more info: http://drshns.blogspot.com/

Friday, June 10, 2011

Schema.org - A New Approach To Structured Data For SEO


Every so often, the search engines love to throw our merry band of SEO types a curveball to keep us on our toes with new toys and upgrades. Yesterday was such a day, worldwide, for structured data in web page design.

What is structured data?

Unless you have lived under a rock for the past two years, you have encountered "rich snippets": those great little search results that help you stand out from the crowd in the organic rankings. Structured data added to web pages lets search engines parse your data into different types of enhanced search results. Review scores, events, recipes, business names, contact names, job titles, and even friend connections on Facebook have all had visible moments in the search results for a "whitelist" of sites.

Making the right markup choice

Google said that "increasing the code is much more difficult, if every search engine information requested in a different way." - This is so true. Webmasters make a difficult decision to choose the markup is quite difficult. the depth and simplicity Microformats or RDFa creativity HTML5 working group adopted the micro-data? And the RDF / XML-based good relations, e-commerce?

It seems the search engines have made this choice for us by jointly introducing a new standard, known as schema.org.

More entity types to choose from

Beyond overcoming inconsistent options for tagging structured data, schema.org opens up a batch of new entity types for webmasters to describe their web pages. Schemas for movies, music, restaurants, local businesses, TV series, and "intangibles" such as offers are all new vocabulary. If you have a website with one of the data types the new schemas describe, you should be excited! See the complete list; it is incredibly comprehensive.

How does schema.org work?

Schema.org is based on microdata. In simple terms, each type of data, or entity, is described by a vocabulary. The vocabulary for an entity is described on the corresponding page on schema.org; if, for example, you have music listings on your website, you simply reference schema.org's MusicRecording vocabulary.

To implement a schema.org vocabulary, you only need to understand three attributes, itemscope, itemtype, and itemprop, and have the URL of the vocabulary at hand.
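As a minimal sketch of what this looks like in practice (the song and artist below are invented for illustration, while itemscope, itemtype, itemprop, and the MusicRecording type come straight from schema.org), a music listing might be marked up like this:

<!-- itemscope marks this div as describing a single item;
     itemtype points at the schema.org vocabulary URL for that item type -->
<div itemscope itemtype="http://schema.org/MusicRecording">
  <!-- each itemprop names a property from the MusicRecording vocabulary -->
  <span itemprop="name">My Example Song</span>
  by <span itemprop="byArtist">The Example Band</span>
  (<span itemprop="duration">PT3M45S</span>)
</div>

A crawler that understands microdata can then read the name, artist, and duration as structured fields rather than guessing them from plain text; the duration uses the ISO 8601 format (PT3M45S is 3 minutes 45 seconds) that schema.org expects.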

For more info: http://drshns.blogspot.com/

Moving To A New Domain Without Losing Your Position In The Google SERPs


Are you afraid to change your domain name? Afraid of losing your position in the Google SERPs? Google actually already offers a comprehensive tool for managing the move of a website or blog. It's just that you probably don't know about it yet, or don't use it well. Here I'm going to share how to change domains without losing your position in the Google SERPs.

I assume here that you have already changed domain names, and that the old domain, while no longer used, still has its DNS name server settings in place and is still under your ownership.

Here is the tutorial.

1. Log in to your Google Webmaster Tools account. Sign up first if you have not already.

2. Add the new domain and submit its sitemap as usual.

3. Then click on the old domain, which should still be registered in Webmaster Tools.

4. Click Site Settings > Change of Address.

5. Choose your new domain in item 4 of the form.

Description:

Here, my old domain is http://mastukul.tk, while my new domain is http://www.unick77.tk.

6. Click Submit. A confirmation statement appears below. If you change your mind, you can cancel by clicking the Cancel button.

After that, we just have to wait and let Google do its work. Google will process the change of address for up to 180 days. If the old domain is still sitting unchanged in the SERPs, try running through the tutorial again from the first step.

Attention!

Do not remove the old domain from Webmaster Tools before all of the SERP results have changed over to the new domain.

Note: before moving to the new domain in Webmaster Tools, you need to do a few things, such as:

1. Set up a 301 (permanent) redirect from the old blog to the new blog.

2. Ask all the sites that link to your blog (through link exchange activity, for example) to update their links to point to the new blog. If you are unable to do so, find as many new backlinks pointing to your new blog as possible (at a minimum, a number equal to the backlinks your old domain had).

3. Check your backlinks again later to make sure they really point to your new domain.

4. Read the complete instructions for domain name changes.

5. This tutorial only applies to top-level domains such as .com, .net, .org, .tk, and so on. Users of subdomains such as blogspot, cc.cc, web.id, co.id, or.id, ac.id, co.cc, and so on cannot use this tutorial.

For more info: http://drshns.blogspot.com/

Saturday, June 4, 2011

SEO Strategies: Search Intelligence


Search engine optimization strategies are a dime a dozen, but what is rare is a radically different approach to ranking well. Search engines, by their very nature (Google, for example), look for fresh, high-quality content. These days I see tons of marketers scraping and auto-generating blogs; the tactics used to optimize these "fake sites" may hit a spot in the rankings temporarily, but it will inevitably be short-lived. Real search engine intelligence comes down to questions like:

How do I create content that people benefit from and link to?

What is a radically different approach to a topic that will create buzz around the net among interested people? Perspective has power.

What can I publish on my blog for free that most people would put in a $10 e-book?

What are people searching for in my niche or main subject, and how do I deliver the answer and rank for those keywords?

While good SEO means getting the basics down and knowing a bit of link building technique, it is also intelligence, imagination, and pure creativity. In many cases I have ranked better in search engines by approaching SEO from a radically different perspective. Asking yourself these 4 key questions is a surefire way to generate the kind of massive quality that ranks well in search engines.

5 SEO Strategies For Bloggers In 2011

Search engine optimization has become an Internet phenomenon. Nowadays, you cannot have a website or blog, do no SEO, and expect to rank in the engines. Big brother Google continues its "permaflux," a term used to describe the infamous constant fine-tuning of the algorithm, while SEOs and webmasters stand ready with tools and intelligence, trying to figure out which edge tactics will help them get more visibility in the search engines. The story of this battle for the edge goes a bit like this:

Google makes changes to the search ranking factors

SEOs figure out the changes

They start using what they learn and rank well because of it

Human nature comes into play: SEOs and webmasters start abusing the ranking system

The permaflux continues

Today, bloggers are not just bloggers anymore; they are web entrepreneurs. They sell books, programs, courses, their own businesses, and other people's stuff. They act as coaches and counselors, and some of them speak in public. Because blogging has become a full-time income for many people, bloggers realize that their search engine game has to improve.

I spent a year working as an SEO for a company in eastern Pennsylvania, and because it is in my nature to take things apart and see how they work, I looked at SEO's structure, what holds everything together, and the definitive strategies for ranking websites. We all know basic SEO, right? Optimize the site, build links, and create quality content. But is that enough? Taking into account the new additions to big brother's algorithm, the social influence factor, and some experiments of my own, here are five search engine strategies for bloggers dying to dominate search results in 2011 and make sure they are doing enough.