Killersites Community
bongy0

Temporary Domain


Hi

 

Recently someone built me a WordPress site. Before it was complete, he emailed me a link that wasn't the proper domain name, just so I could check the site while he was still working on it. It was a kind of dummy URL that would be impossible for anyone else to locate, which was the idea, as the site was not completed yet.

 

What I would like to know is: when I am building a site, in WordPress or even Dreamweaver, how do I use a temporary URL in the same way, so that while I am building it I can send my client to this temporary URL to get his feedback before I transfer the site to the correct domain? And how is that transfer done?

 

I may not be using the correct terminology here, but hopefully you get my point and can help me.

 

Thanks

 

Dene


Your host provider can help you with that. When I get a new client, I set up a temporary URL via the host's control panel and do my project there until it's completed. I then move it over to the correct domain.


I have a folder on my domain called projects. Whatever I build goes in there until it's done. Just upload your client's stuff there to show them. This way you're not potentially tarnishing their domain with unfinished work in Google's eyes.


I basically do what Eric does; I just have a subdomain called testing. Just a different name. Then, to add to what Eric said, you tell your client to go there to see their site until it goes live. Something like http://www.yourdomain.com/nameoffolderorsubdomain


Either way will work. Just keep in mind that you don't want the search engines to index the temporary URL. My suggestion is to use a robots.txt file to disallow search engines from indexing those pages.


Hi Eddie,

Do you know in what instances the temp URL might be indexed?

For example, if the page was called index.html inside a folder, could it potentially be indexed?

Would a password-protected folder be more useful in preventing indexing by search engines?

How does the robots.txt solution work?

 

Thnx


If you set up a temp folder or temporary URL, the odds are that it will not be indexed, as long as you don't post any related URLs at places like this forum. Many developers here will provide a URL with spaces in it, indicating that you need to copy and paste the URL and remove those spaces in order to view the page.

 

Again, robots.txt will work well in preventing pages from being listed, but it's not 100% foolproof, since some lesser-known search engines will ignore it, and those pages may then end up on the major search engines. This only happened to me once, but that was probably because I had the temp site up for such a long time (over a year) before taking it down.

 

The password method is probably the best bet to prevent pages from being indexed by search engines. Keep in mind that you will need to provide the login information to whoever you want to view the temp site.
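If your host runs Apache, the usual way to password-protect a temp folder is a .htaccess/.htpasswd pair. A minimal sketch, assuming the host allows .htaccess overrides; the file path, realm name, and user name below are example values, not anything from this thread:

```apacheconf
# .htaccess placed inside the temp folder
# (assumes an Apache host with AllowOverride AuthConfig enabled;
#  the AuthUserFile path below is an example -- use a path outside
#  your public web root on your own account)
AuthType Basic
AuthName "Client preview"
AuthUserFile /home/youraccount/.htpasswd
Require valid-user
```

The matching password file is created once on the server with something like `htpasswd -c /home/youraccount/.htpasswd clientname`, and you then give the client that user name and password.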


I forgot to add to my recent post the robots.txt code you can use:

# go away
User-agent: *
Disallow: /temp_folder/

Add this to a file named robots.txt in your site root. Change temp_folder to whatever the name of the folder you are trying to block is.

Google "robots.txt" and you will find more information on this.
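You can sanity-check a robots.txt rule before uploading it with Python's standard library. The rule below mirrors the one from this post; example.com and temp_folder are placeholder names:

```python
# Check what a robots.txt rule actually blocks, using Python's
# built-in robots.txt parser. Any compliant crawler (Googlebot
# included) interprets these rules the same way.
from urllib.robotparser import RobotFileParser

rules = """\
# go away
User-agent: *
Disallow: /temp_folder/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Pages under /temp_folder/ are disallowed for every user agent...
print(parser.can_fetch("Googlebot", "http://www.example.com/temp_folder/index.html"))  # False
# ...while the rest of the site remains crawlable.
print(parser.can_fetch("Googlebot", "http://www.example.com/index.html"))  # True
```

Remember that this only asks crawlers to stay away; as noted above, it is not foolproof, since a badly behaved crawler can simply ignore the file.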


Google will only find things as it follows links, just like we do. Google cannot just look in your folders and index things. I'm 99% sure that's true.


Yes, that's correct. Googlebot finds pages in two ways: through the add-URL form at www.google.com/addurl.html, and by finding links as it crawls the web.


Sorry, just to understand better: by "crawling the web", does this mean looking for index.html or index.php in the root directories of all registered domain names?

Also, what about registered subdomain names? Where do they fit in in all of this?


The password method is probably the best bet to prevent pages from being indexed by search engines. Keep in mind that you will need to provide the login information to whoever you want to view the temp site.

 

Hi Eddie,

Thnx for the reply.

Are you saying that crawlers etc. cannot access these folders either? (I wasn't really sure about this.)

Thnx


Thanks, this seems like a prudent step.


Sorry, just to understand better: by "crawling the web", does this mean looking for index.html or index.php in the root directories of all registered domain names?

Also, what about registered subdomain names? Where do they fit in in all of this?

 

Yes. As long as there is a link to that page (for example, index.html or index.php), Googlebot (Google's crawler) can find it and index that page. Subdomains are considered separate websites by Google; however, they can still be crawled and indexed just like any other website.

 

Hi Eddie,

Thnx for the reply.

Are you saying that crawlers etc. cannot access these folders either? (I wasn't really sure about this.)

Thnx

 

Yes. According to Google's own website: "If you need to keep confidential content on your server, save it in a password-protected directory. Googlebot and other spiders won't be able to access the content." This is the simplest and most effective way to prevent Googlebot and other spiders from crawling and indexing content on your site.

 

Hope that helps. :)


Yes. As long as there is a link to that page (for example, index.html or index.php), Googlebot (Google's crawler) can find it and index that page.

So, say I created an e-commerce store for somebody and they had no external online links to their site (because it is brand new), Googlebot would not be able to find them?

 

Yes. According to Google's own website: "If you need to keep confidential content on your server, save it in a password-protected directory. Googlebot and other spiders won't be able to access the content." This is the simplest and most effective way to prevent Googlebot and other spiders from crawling and indexing content on your site.

Hope that helps. :)

 

Great answer, cheers!


So, say I created an e-commerce store for somebody and they had no external online links to their site (because it is brand new), Googlebot would not be able to find them?

 

I think you meant to say inbound links (links pointing toward your website). Theoretically, yes, that could be true; however, it is not likely in practice. Many people will find brand-new websites for a variety of reasons (they may be hackers, they may run directory sites, or they may just like your website and want to link to it).

 

If you want to be sure the search engines don't index your website, password-protect it, or at the very least use a robots.txt file asking the search engines not to index it.


Okay, thanks for all your explanations.

Cheers,

