Jack_mcs

SEO Assistant


You do have to select the database.


The Coopco Underwear Shop

 

If you live to be 100 years of age, that means you have lived for 36,525 days. Don't waste another, there aren't many left.

Tonight I noticed that the Get Index Position for Yahoo failed to open the remote file (Google and MSN are OK).

 

Anyone else noticed this?

I just tried it here and it worked OK. Maybe Yahoo was just busy when you tried it?

 

Jack


Hello Jack

 

Yes, it is working again. Pretty odd though, and I did try it a few times.




Jack, great contribution! I just noticed one thing (it may even be mentioned in the forum): when you use the index position with (show results) and get the results back, it looks like the domain names are truncated. Other than that, it's fabulous!

There seems to be something going on with Google, since I am seeing the same thing. I put in one of my sites, which consistently ranks at position 1 or 2 on all three search engines, and it doesn't show up on Google. If I try to view the source on Google, the browser hangs and reports errors. So they are changing, or maybe have changed, how they list their results. Until that settles down and their code can be checked, there's nothing to be done.

 

Jack

 

Hi Jack

I was just using the SEO tool tonight. It has been working great up until now, but the same problem as above seems to have returned with Google. Is it possible they have again changed how they list their results?

My site: http://designer-fashion-accessories.com

search term: Chanel Designers Scarf

 

It is at number 2 for that search term, thanks to a number of your contributions and tips, but today it no longer appears in the SEO Assistant Index Position search tool. Possibly, when you have time, you could run a check.

Thanks


To improve is to change; to be perfect is to change often.

 


Locate the line (around position 12) starting with $conditions in the admin/includes/modules/seo_google_position.php file and change it to

$conditions = "<cite>(.*)</cite>";
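As a side note on that pattern (an illustrative sketch, not the contribution's actual PHP): a greedy `(.*)` can swallow everything between the first `<cite>` and the last `</cite>` on the page, while a non-greedy `(.*?)` yields one match per listing. The Python below only demonstrates the regex behaviour on made-up HTML:

```python
import re

# Made-up results-page fragment; the real module is PHP, but the pattern behaves the same.
html = '<li><cite>www.example.com/page</cite></li><li><cite>shop.example.org</cite></li>'

# Greedy (.*) runs from the first <cite> to the LAST </cite>, merging the listings:
greedy = re.findall(r'<cite>(.*)</cite>', html)

# Non-greedy (.*?) stops at the nearest </cite>, giving one match per listing:
lazy = re.findall(r'<cite>(.*?)</cite>', html)

print(len(greedy))  # 1
print(lazy)         # ['www.example.com/page', 'shop.example.org']
```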

 

Jack


Hi

That was a quick reply :D

Working perfectly again. Thanks a lot.

John



 


Hello Jack (or anyone that supports the idea)

 

The following URL leads to an article about Google saying that no sites will be penalized for duplicate content, but they will be affected. After looking in Webmaster Tools in my Google account, I found that Google flagged several pages for duplicate content, titles, and meta tags.

 

http://www.lilengine.com/news/google-dupli...this-issue-480/

 

How can this be fixed? Is there any way to use Header Tags Controller to avoid duplicate content?

 

Sounds like a job for Jack! Can a contribution be made to avoid duplicate content?

 

Thanks for any thoughts and comments.


Well, the way I understood it, the point behind Jack's contribution was to avoid these problems (duplicate content) within osC.

 

It works.

 

You can control virtually every page (a couple of pseudo pages left out :rolleyes:) and give them individual names, meta tags, etc. with this contribution, avoiding these problems with Google.

 

For osC users who would rather have www at the start of the domain instead of the bare form, please remember to add a rewrite rule to the .htaccess file, which should look something like this:

 

Options +FollowSymLinks

RewriteEngine on

RewriteCond %{HTTP_HOST} ^my-site\.com

RewriteRule ^(.*)$ http://www.my-site.com/$1 [R=permanent,L]

 

The above example is when your site is www.my-site.com

 

if you would rather have http://my-site.com

 

Options +FollowSymLinks

RewriteEngine on

RewriteCond %{HTTP_HOST} ^www\.my-site\.com

RewriteRule ^(.*)$ http://my-site.com/$1 [R=permanent,L]

 

Then go to Google Webmaster Tools and verify.
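The two rule sets above boil down to one decision: pick a canonical host form and redirect requests for the other form to it. The following Python sketch (illustrative only, not Apache code; `canonical_url` is a made-up helper) mimics that decision:

```python
# Illustrative sketch of what the .htaccess rules decide:
# choose one canonical host and permanently redirect the other form to it.
def canonical_url(host, path, prefer_www=True):
    """Return the redirect target, or None if the request is already canonical."""
    bare = host[4:] if host.startswith('www.') else host
    target = 'www.' + bare if prefer_www else bare
    if host == target:
        return None  # already canonical; no redirect, so no loop
    return 'http://%s%s' % (target, path)

print(canonical_url('my-site.com', '/index.php'))               # http://www.my-site.com/index.php
print(canonical_url('www.my-site.com', '/', prefer_www=False))  # http://my-site.com/
```

Note that each rule must only match the non-canonical host form; a condition that also matches the canonical host would redirect a page to itself in a loop.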

 

 



 


There's a thread here dealing with this subject, and a contribution has come about due to it, although, as I mentioned in that thread, I think it is a non-issue. If having the other pages listed as not part of the group of duplicate-content pages is important to you, then you would need to add something to those pages to prevent that. But that's not a subject for this thread.

 

Jack


Thanks for the reply. I was simply giving the link to the article to get a better understanding for myself and others who may be intimidated by Google's policies on duplicate content. After installing this contribution and Header Tags SEO, I'm more comfortable about the content of my pages. I believe it solved my duplicate content issue. One thing is for sure: SEO is an ongoing process. Maybe after Googlebot crawls my site again, some of the flagged pages will get 'de-flagged'.

 

The contribution you referred me to is kind of vague in its purpose to me, or my lack of knowledge of SEO could be the factor. I may install it and draw my own conclusion. I installed SEO Assistant, and none of my pages are in the supplemental index, so I believe my rankings in search results have suffered due to lack of content and lack of keyword distribution.

 

I installed both SEO Assistant & Header Tags SEO. Great work.

 

Please forgive me for sounding like a dunce in my previous post. I was just alarmed. Thanks for responding.


There's no need to apologize at all and I am sorry if my reply made you feel that way. Posting the article was helpful. I follow such things but hadn't seen that one yet. Actually, others had already requested such a feature from Header Tags and, after thinking about it, especially in light of what google has said about duplicate content, I will be adding a feature in Header Tags to, hopefully, help with that.

 

Pages shouldn't be in the supplemental index, so that's a good thing.

 

I wasn't promoting that contribution I referenced. I haven't looked at it, and I'm not aware of any results from it, so I can't say if it is worth it or not. I wouldn't suggest installing a contribution just because of a perceived problem. You may want to wait a few months on it and see what others say.

 

Jack


Adding that feature would be great and would make others feel better. Google being the leading SE, most SEO efforts and modifications are built around Google's policies. Maybe one day you could first place a feature in SEO Assistant that could find which pages are flagged by Google as duplicate content.

 

This may help others:

 

Log in to your Google account if you have one and go to Webmaster tools.

 

Click on the website for which you have submitted a sitemap.

 

Once there, click on 'content analysis'.

 

If you have any pages flagged for duplicate content, they would be found there.

 

 

At the moment I have pages flagged for duplicate meta descriptions, short meta descriptions, missing title tags, and duplicate title tags.

 

I included this information so it may help someone solve these issues they may be experiencing.

 

Also, hopefully Jack can add features to both Header Tags SEO and SEO Assistant to help us out with these issues.

 

Thanks for any help and comments.



It's better to post tips like this in the tricks and tips forum since more people would be likely to see it.

 

As for SEO Assistant and duplicate content pages, there's no simple way to check that via the contribution, since it is not something Google lists. A few checks can be done via the code, and the difference may be duplicate content pages, but it may not be either, so I don't see that that provides much useful information, which is why I haven't added it. The supplemental pages option can be used for about the same purpose, though.

 

Jack


 

Hi Jack

Since today I am getting a new error with Google, Yahoo, and MSN:

 

Unable to open remote file http://www.google.com/search?as_q=Yamaha+R...ges&start=0.

Warning: fclose(): supplied argument is not a valid stream resource in /home/xxxxxx/public_html/xxxx/admin/includes/modules/seo_position.php on line 110

 

 

Unable to open remote file http://search.msn.com/results.aspx?q=Yamah...=0&count=20.

Warning: fclose(): supplied argument is not a valid stream resource in /home/xxxxxx/public_html/xxxxxxxx/admin/includes/modules/seo_position.php on line 110

 

 

Unable to open remote file http://search.yahoo.com/search?_adv_prop=w...mp;n=100&b=.

Warning: fclose(): supplied argument is not a valid stream resource in /home/xxxxx/public_html/xxxxx/admin/includes/modules/seo_position.php on line 110

 

Do you think this is temporary?

 

Any advice?

Thanks, John



 


Probably it is your web host who has stuffed you up (as has mine).
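For what it's worth, the fclose() warnings quoted earlier usually mean fopen() returned false (the open failed, for instance because the host disabled allow_url_fopen) and the script then tried to close that false value. The guard below is an illustrative Python sketch of the fix, not the contribution's code; `safe_read` and the two openers are made-up names:

```python
import io

def safe_read(open_fn, url):
    # PHP's fopen() returns false on failure; calling fclose() on that false
    # value is what raises the "not a valid stream resource" warning.
    # The guard here is the analogue: only close a handle that actually opened.
    handle = open_fn(url)
    if handle is None:
        return None  # open failed; nothing to close, no secondary warning
    try:
        return handle.read()
    finally:
        handle.close()

# Hypothetical openers standing in for fopen():
working = lambda url: io.StringIO('<html>results</html>')
blocked = lambda url: None  # e.g. the host has disabled remote fopen()

print(safe_read(working, 'http://www.google.com/search'))  # <html>results</html>
print(safe_read(blocked, 'http://www.google.com/search'))  # None
```

The guard only silences the secondary warning; if the host has blocked remote fopen(), the underlying "unable to open remote file" error remains until the host re-enables it or the code switches to another transfer method.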




Hello,

 

I've installed SEO Assistant, but it seems it is not working correctly.

 

When I click on Page Rank after typing my domain "www.plixx.com.br", the following error appears:

 

"Failed to read url: www.plixx.com.br"

 

When I click on Get Position using the search term grampeador c240a, for which my domain www.plixx.com.br ranks first on Google, the following message appears:

 

The site www.plixx.com.br is not in the top 10 for the term grampeador c240a on Google

 

Do you have any tips?

 

Thank you in advance


The PR problem is probably due to the .com.br. I will have to look into that. But index position works correctly when I try it here.

 

Jack


Probably your web host; ask them.




 

Hi Jack,

 

I would be thankful if you could look into this ".com.br" problem.

 

Paulo


Mine does the same since the web host screwed with fopen.




 

I talked to my host's staff. They told me there are no restrictions on fopen. In fact, when I search for the page rank of my host's own domain, "www.centralserver.com.br", there is no error.

