Geeks help please re robots.txt


Viewing 11 posts - 1 through 11 (of 11 total)
  • #24031
    Serena
    Member

    Is robots.txt recommended?
    I’m reading that it could complicate things…

    Cheers Geeks
    S
    x

    #24033
    admin
    Keymaster

    Yes, you should have a robots.txt on your site. Search engines request it before crawling; if it’s missing they just get a 404 and crawl everything anyway. So the main reason to have one is to keep spiders out of areas you don’t want them in. The file doesn’t need to be complicated, it’s just a few Allow and Disallow statements.
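
    For example, a minimal robots.txt might look like this (the /private/ path is just a placeholder for whatever folder you want kept out of search results):

    ```
    # Applies to all robots
    User-agent: *
    # Keep this folder out of search engines
    Disallow: /private/
    ```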

    -Pk

    #24035
    Serena
    Member

    Thank u PK
    Shall I stick it on the top level?

    #24036
    admin
    Keymaster

    Yes, it should be in your home or public_html folder (wherever your main index page is).

    -PK

    #24039
    Anna
    Member

    I’ve had one of these on my Pro-domme site for a long time now. In fact I reckon most sellers should have one too – just in case.

    #24040
    Anna
    Member

    I posted this on a pro-domme forum a while back so it’ll refer to PDs obviously but it is just as valid for panty sellers.

    The links won’t work because I have just copied and pasted it from the other forum.

    Hello Ladies

    Some of you may already know this…so apologies if I’m suggesting that you suck eggs.

    As you all know there have been an awful lot of outings recently in the Press. I have the horrible feeling that the Max Mosley exposé has increased the profile of Dominatrices and we may get some more attention.

    I know of a couple of people on IC who have reported some very strange requests for sessions…and have subsequently pulled their £ profiles.

    Also our Lady P has been subject to some unwanted attention.

    Threat or actual exposure usually results in the Mistress taking down her website. Unfortunately all this does is protect your bandwidth from rubberneckers. It doesn’t protect your privacy.

    For example:
    Mistress Abi (the PD who exposed Max Mosley) – [old link] This online newspaper refers to an internet archive. http://www.archive.org/index.php They describe how they have accessed her site and her images using it to report on her words.

    Want to see Mistress Abi’s site? Any one of those dated links will take you to her site after every update. Sometimes the photos will have broken links, sometimes they will still be there for people to identify you…or print in the rags.

    Edit – curiously the week after I posted this Mistress Abi’s site had a robots.txt added

    Pop your own url in at the top of that page and see your own site in all its reinventions.

    To explain

    Wayback uses a robot (ia_archiver) to access your site and take copies of all your pages and images, then makes them public after 6 months or so. To stop this robot we need to send up a file so that the robot knows to ignore our site. It *will* honour it. At the time of writing, the same exclusion also hides any pages the archive has already taken.

    You can prevent this. For example:

    My site – http://web.archive.org/web/*/http://www.ladyanna.co.uk

    😀

    If I get a sniff of exposure I can pull my site and *nobody* can then access my materials…unless of course they did it all before …but how many journos would think we know how to protect ourselves… 😉

    This is what you need to do.

    1. Create a .txt file.
    You do this by opening up Notepad. Notepad is found via Start – All Programs – Accessories – Notepad.
    Once open you need to copy and paste this into the window.

    User-agent: ia_archiver
    Disallow: /

    Save your text file as ‘robots’ (check this: the file has to be called robots.txt, I was wrestling with robot.txt for ages) and save it into your website folder on your hard drive.

    Using your FTP agent you need to send up this robots.txt file to the same folder that your index.html file is in. Your index file is the first page someone gets when they enter your url. You may have named it home.html for example.

    That’s it.

    To check it works go back to Wayback and submit your url. You should get a nice exclusion screen.
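
    If you’d like to double-check the rules themselves before uploading, Python’s standard-library robots.txt parser can confirm that the file blocks ia_archiver while leaving ordinary search engines alone. This is just a local sanity check; the example.com address is a placeholder, not a real site:

    ```python
    from urllib.robotparser import RobotFileParser

    # The exact contents of the robots.txt described above
    rules = [
        "User-agent: ia_archiver",
        "Disallow: /",
    ]

    rp = RobotFileParser()
    rp.parse(rules)

    # ia_archiver (the Wayback robot) is blocked from the whole site...
    print(rp.can_fetch("ia_archiver", "http://example.com/"))         # False
    # ...but other robots, like Googlebot, are unaffected
    print(rp.can_fetch("Googlebot", "http://example.com/index.html"))  # True
    ```

    Handy if you later add more rules and want to be sure you haven’t accidentally locked the search engines out too.
    
    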

    If you get stuck, give me a poke.

    Oh, one thing. This advice is for those who have their own independent sites. I don’t use moonfruit for example so I don’t know how you can send a .txt file up to them.

    Hope this helps

    #24041
    Serena
    Member

    very useful
    thank you anna xx

    #24043
    anni
    Member

    What about sites that have lapsed payment like my old url? I see it up there but how do I change it?

    #24044
    Anna
    Member

    Annilyn Lily wrote:

    What about sites that have lapsed payment like my old url? I see it up there but how do I change it?

    You can only change it if you have access, and I suppose you can only have access whilst the payments are up to date.

    Dunno Anni, might be too late now.

    #24045
    admin
    Keymaster

    Annilyn Lily wrote:

    What about sites that have lapsed payment like my old url? I see it up there but how do I change it?

    If you can’t upload files to the old domain there isn’t anything that you can do other than register it again, or write a nice email to the archive service and ask if they could delete it for you. I have no idea if they would or not, but it wouldn’t hurt to try.

    🙂 PK

    #24048
    anni
    Member

    Shucks. Thanks, guys.
