ARGGG, I am suffering from the "the more I know, the less I know" effect
of reading too much techie stuff.
I think I finally understand the usefulness of robots.txt, robots meta
tags, and .htaccess in relation to search engines and spiders, but I have
no idea how to go about implementing any of them. I am trying
to come up with an efficient yet cost-effective way to get things
straightened out. What I want to do is not drop any lower in the search
engines and eventually improve my listing in Google (we all want that).
As usual I need advice... (someday I hope to contribute something)
What I need to do is:
1. Get the pages that have nothing to do with products out of the
search engines. (Google has spidered my shopper login page in Merchant,
as an example.) That page does not need to be listed anyplace, since
it has no bearing on what I am selling. Do I use robots.txt for that
stuff?
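If robots.txt turns out to be the right tool, the file is just a plain-text list of Disallow rules served from the site root. A minimal sketch, assuming the login page is served through a hypothetical /Merchant2/merchant.mvc URL (check your actual Merchant paths before using this):

```text
# robots.txt -- must be reachable at http://corsetsandcostumes.com/robots.txt
User-agent: *
Disallow: /Merchant2/merchant.mvc
```

One caveat: the original robots.txt standard matches by URL path prefix only, so a rule like this blocks every page served through merchant.mvc, not just the login screen. If you want to exclude only the login page while keeping other dynamic pages indexable, a robots meta tag on that one page is the more precise tool.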
2. Get rid of duplicate product pages so Google does not think they are
mirror pages. Options are:
(a.) Optimize the Merchant product page URLs and create a
site map of products, with no static pages.
(b.) Or keep using Static Catalog to generate the product pages, and
use robots.txt to keep Google from spidering the pages in
Merchant.
(c.) Or the do-it-all option: optimize the Merchant product
pages, have duplicate static pages, a site map of the Miva products, and use
robots.txt to keep Google out of the Merchant pages and only let it have
the static pages.
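For options (b.) and (c.), an alternative (or complement) to robots.txt is a robots meta tag placed in the head of each dynamic Merchant page, telling any spider that does fetch the page not to index it or follow its links. A sketch of the tag; the surrounding HTML is illustrative only:

```text
<head>
  <title>Product page (dynamic Merchant copy)</title>
  <!-- keep this duplicate page out of the index; don't follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
```

The trade-off versus robots.txt: the meta tag requires the spider to crawl the page before seeing the instruction, but it works per page, whereas robots.txt blocks by path prefix.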
3. A site map seems very important. (I have a lame one on the site now,
done with a freebie tool on the net, but it only covered the static pages.)
Could I use Swap Link from Bill Weiland for the site map? But since I
already have Static Catalog Generator, do they duplicate stuff,
cancel each other out, or what?
So which is it by popular opinion: (a.), (b.), or (c.)? I am leaning
towards (c.), but is that overkill? And is there one module out there
that will do all that?
Thank you
Linda T
http://corsetsandcostumes.com