Geeks help please re robots.txt
November 5, 2008 at 2:23 pm #24031 – Serena (Member)
Is robots.txt recommended?
I’m reading that it could complicate things… Cheers, Geeks
S x
November 5, 2008 at 3:24 pm #24033 – admin (Keymaster)
Yes, you should have a robots.txt on your site. Search engines are supposed to look for it before indexing your site and may leave if they don’t find one. The main reason is still to keep spiders out of areas you don’t want them in. The file doesn’t need to be complicated; it’s just a few Allow and Disallow statements.
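For illustration only (the /private/ folder here is just a made-up placeholder), a minimal robots.txt along those lines might be:

User-agent: *
Disallow: /private/

That reads as: every robot may crawl the site, except anything under /private/.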
-Pk
November 5, 2008 at 3:30 pm #24035 – Serena (Member)
Thank u PK
Shall I stick it on the top level?
November 5, 2008 at 3:34 pm #24036 – admin (Keymaster)
Yes, it should be in your home or public_html folder (wherever your main index page is).
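To picture it – assuming a typical public_html web root, with the file names only as examples – the layout would be:

public_html/
    index.html   (your main page)
    robots.txt   (sits alongside it)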
-PK
November 5, 2008 at 8:20 pm #24039 – Anna (Member)
I’ve had one of these on my Pro-domme site for a long time now. In fact I reckon most sellers should have one too – just in case.
November 5, 2008 at 8:23 pm #24040 – Anna (Member)
I posted this on a pro-domme forum a while back so it’ll refer to PDs obviously, but it is just as valid for panty sellers.
The links won’t work because I have just copied and pasted it from the other forum.
Hello Ladies
Some of you may already know this…so apologies if I’m suggesting that you suck eggs.
As you all know there have been an awful lot of outings recently in the Press. I have the horrible feeling that the Max Mosley exposé has increased the profile of Dominatrices and we may get some more attention.
I know of a couple of people on IC who have reported some very strange requests for sessions…and have subsequently pulled their £ profiles.
Also our Lady P has been subject to some unwanted attention.
Threat or actual exposure usually results in the Mistress taking down her website. Unfortunately all this does is protect your bandwidth from rubberneckers. It doesn’t protect your privacy.
For example:
Mistress Abi (the PD who exposed Max Mosley) – [old link] This online newspaper refers to an internet archive, http://www.archive.org/index.php – they describe how they accessed her site and her images through it to report on her words.
Want to see Mistress Abi’s site? Any one of those dated links will take you to her site as it stood after each update. Sometimes the photos will have broken links; sometimes they will still be there for people to identify you… or print in the rags.
Edit – curiously the week after I posted this Mistress Abi’s site had a robots.txt added
Pop your own URL in at the top of that page and see your own site in all its reinventions.
To explain:
Wayback uses a robot to access your site and take copies of all your pages and images. It then archives them after 6 months or so. To stop this robot we need to send up a file so that the robot knows to ignore our site. It *will* ignore it. This file will also make any pages already archived inaccessible – the archive excludes them from view rather than deleting them outright.
You can prevent this. For example:
My site – http://web.archive.org/web/*/http://www.ladyanna.co.uk
😀
If I get a sniff of exposure I can pull my site and *nobody* can then access my materials…unless of course they did it all before …but how many journos would think we know how to protect ourselves… 😉
This is what you need to do.
1. Create a .txt file.
You do this by opening up Notepad. Notepad is found via Start – All Programs – Accessories – Notepad.
Once open, you need to copy and paste this into the window:

User-agent: ia_archiver
Disallow: /

Save your text file as ‘robots’ (check this – it has to be robots.txt; I was wrestling with robot.txt for ages) and save it into your website folder on your hard drive.
Using your FTP client, upload this robots.txt file to the same folder that your index.html file is in. Your index file is the first page someone gets when they enter your URL – you may have named it home.html, for example.
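If you’d rather script the upload than click through an FTP client, a rough sketch in Python might look like the below – the host, login details and folder name are all placeholders you’d swap for your own:

import ftplib

# Placeholder credentials – substitute your own host and login.
HOST = "ftp.example.com"
USER = "username"
PASSWORD = "password"

with ftplib.FTP(HOST, USER, PASSWORD) as ftp:
    ftp.cwd("public_html")  # the folder where your index.html lives
    with open("robots.txt", "rb") as f:
        ftp.storbinary("STOR robots.txt", f)
    ftp.retrlines("LIST robots.txt")  # list the file back to confirm the upload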
That’s it.
To check it works, go back to Wayback and submit your URL. You should get a nice exclusion screen.
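You can also double-check the file itself is saying what you think. As a sketch – Python again, with example.com standing in for your own domain – the standard library’s robots.txt parser will tell you whether ia_archiver is blocked:

import urllib.robotparser

# example.com is a stand-in – use your own domain here.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

# Prints False if ia_archiver is disallowed from the whole site.
print(rp.can_fetch("ia_archiver", "http://www.example.com/"))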
If you get stuck, give me a poke.
Oh, one thing. This advice is for those who have their own independent sites. I don’t use moonfruit for example so I don’t know how you can send a .txt file up to them.
Hope this helps
November 5, 2008 at 10:12 pm #24041 – Serena (Member)
Very useful, thank you Anna xx
November 5, 2008 at 10:50 pm #24043 – anni (Member)
What about sites that have lapsed payment, like my old URL? I see it up there but how do I change it?
November 5, 2008 at 11:57 pm #24044 – Anna (Member)
Annilyn Lily wrote:
What about sites that have lapsed payment, like my old URL? I see it up there but how do I change it?
You can only change it if you have access, and I suppose you can only have access whilst the payments are up to date.
Dunno Anni, might be too late now.
November 6, 2008 at 12:52 am #24045 – admin (Keymaster)
Annilyn Lily wrote:
What about sites that have lapsed payment, like my old URL? I see it up there but how do I change it?
If you can’t upload files to the old domain, there isn’t anything you can do other than register it again, or write a nice email to the archive service and ask if they could delete it for you. I have no idea if they would or not, but it wouldn’t hurt to try.
🙂 PK
November 6, 2008 at 11:39 am #24048 – anni (Member)
Shucks. Thanks, guys.