Posts: 199
Threads: 102
Joined: Jul 2006
Hi everyone,
Ok, I've been rethinking this issue of parsing RSS feeds as new content. With quite a lot of webmasters parsing those feeds, couldn't it spell "duplicate content" trouble for those sites? What if there are a bunch of sites parsing the same RSS feeds? Wouldn't these sites get penalized for having the same, duplicated content?
Has anyone run into this potential issue?
Posts: 374
Threads: 35
Joined: Jul 2006
It absolutely can lead to duplicate content penalties. However, the SERPs are far from perfect, so it is possible you will end up as the authority for some pages and a duplicate for others. I have had RSS feeds get ranked and get hits from search engines, so it is certainly possible.
Posts: 199
Threads: 102
Joined: Jul 2006
Ok, I am getting a little crazy about this idea:
1. I have 10 sites/domains hosted under the same IP (my own server, with a data center in the US).
2. I parse the same RSS feeds into all of the sites.
3. Each parsed RSS feed will have a URL-friendly path, e.g.:
- mydomain1.com/RSS-updates.html
- mydomain2.com/RSS-updates.html
- mydomain3.com/RSS-updates.html
- ....and so on for the rest of sites
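For what it's worth, here's roughly what I mean by step 2 as a quick Python sketch: take a fetched RSS feed and spit it out as a static HTML page like RSS-updates.html. The feed contents and links below are just made-up examples, and I'm only using the standard library.

```python
# Sketch: render one RSS 2.0 feed (already fetched as XML text) into a
# simple HTML list for a page like mydomain1.com/RSS-updates.html.
# SAMPLE_RSS is an invented example feed.
import xml.etree.ElementTree as ET

SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Example Feed</title>
  <item><title>First story</title><link>http://example.com/1</link></item>
  <item><title>Second story</title><link>http://example.com/2</link></item>
</channel></rss>"""

def rss_to_html(rss_xml):
    """Turn each <item> in the feed into an HTML list entry."""
    channel = ET.fromstring(rss_xml).find("channel")
    rows = []
    for item in channel.findall("item"):
        title = item.findtext("title", default="(untitled)")
        link = item.findtext("link", default="#")
        rows.append('<li><a href="%s">%s</a></li>' % (link, title))
    return "<ul>\n%s\n</ul>" % "\n".join(rows)

html = rss_to_html(SAMPLE_RSS)
print(html)
```

Of course, if every domain runs this same script against the same feed, the pages come out byte-for-byte identical, which is exactly the duplicate content worry.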
Would the SEs drop these sites and ban them? Or just treat them as more URLs and index them?
I know this seems very crazy/nuts; I am cracking my head over my next experiment. :p
Posts: 374
Threads: 35
Joined: Jul 2006
What are you trying to accomplish? I think making 10 duplicate sites on the same IP would just be asking for trouble. If you want to do that, I would invest in separate shared hosting for each site, or at least get an individual IP for each site.
Posts: 199
Threads: 102
Joined: Jul 2006
Hi guys and gals,
Been busy lately testing out several things. :p Regarding the potential "harm" of duplicated content via RSS parsing, how do you guys actually go about avoiding it? And by the way, where could you find a few hundred different RSS feeds? I know Yahoo has lots of RSS feeds, but how far does that go?
I am going to run some crazy experiments soon. :p
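One rough idea for the experiment, to keep the parsed pages from being identical across my domains: give each site a different slice and order of the same feed items. Just a sketch, and the item names are made up:

```python
# Sketch: rotate the shared feed's items by each site's index and
# truncate, so no two domains render the same sequence of stories.
def items_for_site(items, site_index, per_page=2):
    """Return this site's slice of the shared item list."""
    rotated = items[site_index:] + items[:site_index]
    return rotated[:per_page]

items = ["story A", "story B", "story C", "story D"]
print(items_for_site(items, 0))  # ['story A', 'story B']
print(items_for_site(items, 1))  # ['story B', 'story C']
```

No idea yet whether partial overlap like this is enough to dodge a duplicate content flag; that's what the experiment is for.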
Posts: 171
Threads: 9
Joined: Jul 2006
ok,
this is what i would suggest.
don't do 10 of the exact same blog. you are seriously asking to be blacklisted.
you can however have a 'health' portal and post 5 blogs (either sub domain, or sub directory).
each subdomain or subdir will need to have the keyword featured prominently, e.g.
cancer.health.com or health.com/health for SERP purposes.
then you can do RSS scraping using RSStoBlog or another tool.
here's the important part! make sure you have 5 DIFFERENT niches.
for health, you might have cancer, diabetes, high blood pressure, etc.
if you scrape google news and health news, you can use some overlapping content.
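a rough sketch of what i mean by splitting scraped content into niche buckets: match each headline against per-niche keywords so each subdomain gets its own subset. keywords and headlines below are just invented examples, not output from any real tool.

```python
# Sketch: assign scraped headlines to every niche whose keywords they
# mention, so cancer.health.com and the diabetes section get different
# (possibly overlapping) content. All data here is hypothetical.
NICHES = {
    "cancer": ["cancer", "tumor", "oncology"],
    "diabetes": ["diabetes", "insulin", "blood sugar"],
}

def bucket_by_niche(headlines):
    """Return {niche: [matching headlines]} via keyword matching."""
    buckets = {niche: [] for niche in NICHES}
    for headline in headlines:
        lowered = headline.lower()
        for niche, keywords in NICHES.items():
            if any(kw in lowered for kw in keywords):
                buckets[niche].append(headline)
    return buckets

headlines = [
    "New insulin pump approved",
    "Oncology trial shows promise",
    "Blood sugar tips for summer",
]
print(bucket_by_niche(headlines))
```

a headline that mentions two niches lands in both buckets, which is the "overlapping content" case i mentioned.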
--
there might be some duplicate content penalties from google, but if you scrape correctly, it shouldn't be a major issue.
ps: if you have any major internet marketing issues to pose, you can post a comment on my blog. i'm always looking for questions/problems to blog on!
heh.