[Gloucs] Re: Web site building and ranking

Clare Elliott gloucs at mailman.lug.org.uk
Tue Jan 7 08:02:01 2003


There are certainly braille readers that can work with standard Internet
Exploder - I used to do support for people using them - combined with
tab navigation through links, it gave a pretty fast (and good, according
to the people using them) browsing experience.

It actually worked with any 'windowed' application (never seen one
working under Linux, though), but I never tried it with any other
browser.
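
Tab navigation also relies on the link text making sense when it's read
out on its own, so descriptive links and alt text help a lot - roughly
this sort of thing (a made-up example, not from any site I supported):

  <a href="minutes.html">Minutes of the January meeting</a>
  <img src="map.gif" alt="Map showing the route to the meeting venue">

rather than a bare "click here" or an unlabelled image.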

Clare

> -----Original Message-----
> From: Guy Edwards [mailto:guy_j_edwards@hotpop.com]
> Sent: 06 January 2003 17:02
> To: MAILING LIST
> Subject: [Gloucs] Re: Web site building and ranking
>
> I know it's normally rude to post a private mail to a list but it's
> relevant and I've removed anything personal.
>
> On Mon, 2003-01-06 at 11:03, bjh wrote:
> > This is starting to get to be an interesting exchange and the
> > implications for building web sites for blind persons to see
> > are normally totally ignored by most site builders...
>
> I think the reading devices used by the blind are either voice
> synthesizers or braille screen readers, I don't know for sure. Either
> way I thought they used a basic input from a text based browser, or
> variation upon that. I know there was an article in LinuxFormat a long
> time ago saying that SUSE could now be installed and run by a blind
> person as they had added braille reader support to the SUSE install
> right at the start.
>
> If no-one minds me asking, is there anyone on the list using a braille
> or voice device to surf?
>
> > A possibly useful site to look at for this purpose is
> > http://vision.unco.edu/edtech/primer/ which gives examples of the
> > effects of a number of factors we have been talking about - may be
> > worth a look for you...
>
> The link is good. Here's a snippet of something I hadn't thought of...
>
> "White-on-white is a technique that provides extra help for users who
> may require additional information. While sighted people are easily
> distracted by a lot of extra words on a page, blind people often need
> some extra information to help in navigation or to understand what is
> happening. We put that extra information in a font that is the same
> color as the background. Since most of our web pages have a white
> background, we call this technique white-on-white. It would work just
> as well in other colors."
>
> [ snipped ]
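>
> (In html terms I presume that just means something like this - a rough
> sketch, I haven't tested it with a screen reader:
>
>   <body style="background-color: #ffffff">
>     <p style="color: #ffffff">Skip ahead: the main navigation links
>     are at the bottom of this page.</p>
>   </body>
>
> so a screen reader still reads the text out, but a sighted visitor
> looking at a white page never sees it.)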
>
> You've been coding 16x longer than I have :-) I've just remembered who
> you are too. Is the date on http://www.gloucs.lug.org.uk/about.php
> correct? i.e. was the LUG founded in June 2000? I meant to ask many
> months ago.
>
> I understand html has changed a lot but most of the massive early
> changes were due to the browser wars, whereas now, with xml creeping
> in and browsers priding themselves more on being bug-free and
> standards compliant, I really think it's getting better.
>
> > Putting it into context - a site can be technically perfect in terms
> > of minimal html coding etc., but if it is never found, it is a total
> > waste of the precious time and resources (time is money) taken to
> > put it together.
>
> Again, we have different audiences. With an Open Source project I
> would rather have a technically perfect site full of content, as it
> will be the program's users seeking information, help, or discussion.
> I would quite happily sacrifice search engine rankings for that.
>
> With Open Source you aren't trying to attract customers - that's a
> by-product of good effort. Linus was just having fun when he made his
> modem terminal, and everyone else joined in. He wasn't thinking about
> stealing market share from Microsoft or replacing Unix in
> corporations.
>
> I wouldn't worry that an Open Source project isn't attracting new
> users through its search engine rankings. I would worry that it
> didn't have ease of use, reliable documentation or stable code. If
> you make something half decent and put the effort into your project,
> people appear from nowhere. I use mailing lists, newsgroups, websites
> and lots of two-way emailing (e.g. conversations with users who have
> asked for help).
>
> > Technical computer books are not necessarily written by persons
> > qualified by the practical experience of the reality of using the
> > facility; in fact many books unfortunately spread complete nonsense,
> > because the writer will use other previously written material and
> > regurgitate the information as FACT without extensive TESTING on the
> > net of the subject matter to see if it really does work.
>
> I agree, I have a "teach yourself" book on html and xhtml but the
> xhtml additions are a complete afterthought and most of the code will
> never get through a validator. This is more a problem of tracking
> down the right book (or other resource) I feel, though - that and
> knowing which bits to ignore and which bits to remember.
>
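> (For example - I'm going from memory here, so don't hold me to the
> exact rules - the sort of old-style markup that trips a validator up
> is:
>
>   <P>Latest news<BR>
>   <IMG SRC=news.gif>
>
> because xhtml wants lowercase tags, quoted attributes, alt text on
> images and every element closed, i.e. something more like:
>
>   <p>Latest news<br />
>   <img src="news.gif" alt="News" /></p>
>
> so half the battle is spotting which habits in the book are the old
> ones.)
>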
> This next bit might be of use to people so:
>
> > Suggestion - based on experience - made to help our reader(s)
> > (smile), and save them months of wasted time, who may be thinking of
> > building a site or page for the first time - is:
> >
> > Sit down with a piece of paper and pencil, decide on every possible
> > key word for the intended site which you think an internet browser
> > may just put into a search engine to find the subject matter...
> [snip]
> > Then go to the computer and submit each of those words (and phrases
> > or small group of words) into a number of different main target
> > search engines...
> >
> > Following each separate search, click on the number one site, and
> > with the site open on the screen, click on View on the browser
> > header bar, and then Source - this opens on screen all the html
> > coding used to build the page... you are now looking for each (if
> > any) meta tagged keyword and the actual order the keywords are
> > listed - then click back on to the normal screen and check to see
> > where your own selected search words were actually visible on the
> > page as text... - RECORD THE SEARCH ENGINE and INFORMATION - YOU
> > NOW KNOW THE FACTUAL INFORMATION TO BUILD YOUR SITE for that given
> > search engine and order of search words to achieve a high ranking
> > position... (ignore all references to graphics when doing this
> > exercise!).
> >
> > Only at this point can you even think about how a site is going to
> > be built to make it as visually attractive as possible to appeal to
> > the viewer when he/she has found it... (Perhaps with very careful
> > use of colour alone and the MINIMAL amount of text on each page to
> > create the interest from the reader without boring them, and also
> > to achieve the search engine ranking required - remember, each
> > additional NON keyword will normally dilute the effect of the
> > keywords for ranking purposes - it is about achieving balanced copy
> > without trying to put a book on the page (like this email is
> > starting to look like - smile)...
>
> I thought the bot did more than just grep (not grep, but you know
> what I mean) the page for words. There's some kind of html parsing
> going on.
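>
> The sort of thing I mean (a made-up page, just to show the shape of
> what the bot has to parse - I don't know how much weight each bit
> gets):
>
>   <html>
>     <head>
>       <title>GlosLUG - Gloucester Linux User Group</title>
>       <meta name="keywords" content="Linux Gloucester GlosLUG" />
>     </head>
>     <body>
>       <h1>Gloucester Linux User Group</h1>
>       <p>Meeting dates, <a href="help.php">Linux help</a> and talks.</p>
>     </body>
>   </html>
>
> I'd guess the title, headings and link text count for more than a word
> buried halfway down a paragraph, but that's only a guess.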
>
> > You then proceed and carry out the lengthy exercise for each of the
> > target search engines and slowly a picture will develop for you to
> > move to the final stage of actually building your webpage for that
> > given set of circumstances - it may be that in order to get certain
> > search engines to rank your site you will have to build different
> > pages with basically the same information, but structured in a
> > totally different way so that the site will be ranked highly by
> > different search engines - if you have to do this, you then submit
> > the different pages to the specific search engine and you should
> > achieve utopia over a period of time - I have noticed that some
> > search engines will suddenly rank one of my sites anything up to
> > two years after submission for no particular reason...
>
> I'd say it's because it's ranking against a lot of other sites which
> rise and fall as they're developed or removed. But there's more to it
> of course (different passes in the first and second month are done by
> different types of googlebot, for instance).
>
> > The fastest search engine to rank is probably at present Lycos, and
> > although they push paid-for submissions, they will rank a free
> > submission site after about four to six weeks!
> > http://www.lycos.co.uk/service/addasite.html - forget what they say
> > on the site, they do rank FREE submissions quickly - they need the
> > information you have on your site to build their database and
> > attract new customers!
> >
> *****************************************************************
>
> > > Do a Google for Linux cad viewer, or Linux dwg, and the site's at
> > > the top of the search. Or do a search for Gloucester Linux - top
> > > again.
> >
> > *** Sorry, but that is because you are being very specific - that
> > is, Gloucester Linux will only be found by someone who puts
> > "Gloucester Linux" in the search engine at that time...
>
> My argument would be: if you're looking for Linux in Gloucester,
> those are the two words you type in. It's a very specific site. How
> do you Google for Linux in Gloucester without using those two words?
> If they're after "Linux Support" there are better sites out there to
> help them, so why bring them to this site? They aren't going to give
> me money and I'm not providing a better resource than the bigger
> sites.
>
> > linux, help, support, information, assistance, meeting, meetings,
> > regular, expertise, looking, for, programmes, programs, of, message,
> > board, glos, gloucestershire, cotswolds, uk, up, dates, updates,
> > need, reviews, projects, code, codes, presentations, talks, experts,
> > begginner, begginers, on, distros, method, methods, booting, dual,
> > windows, with, install, installing, installation, dual, and, easy,
> > I, use, mandrake, suse, Suse, SuSe, red, hat, redhat, derbian, in,
> > member, members, group, to, (or whatever)...
>
> Right, for a start that capitalization doesn't matter (suse*3) as far
> as I know, and second, "on, and, I, up, for, with, need" etc. are
> junk. I would refuse to put them in any site.
>
> From the present GLUG site:
>  <meta content="Linux Gloucestershire GlosLUG Linux User Group
>  gloucs.lug.org.uk Linux support Cheltenham Stroud Gloucester Cotswolds
>  Forest open source mandrake redhat suse" name="keywords" />
>
> We just slapped some words in (just noticed Linux is in there 3 times
> - doh!). I think we've got the main meat of it already. My future idea
> would be to go through each of the GLUG site's PHP pages and add a set
> of tags specific for each page. Same for Lx-Viewer (I'll probably do
> this tonight).
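>
> For the meetings page, say, that might end up as something like this
> (made-up keywords, just to show the idea):
>
>  <meta name="keywords" content="GlosLUG meeting meetings dates venue
>  Gloucester Cheltenham Linux user group" />
>  <meta name="description" content="Meeting dates and venue for the
>  Gloucester Linux User Group" />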
>
> [snip]
>
> > The combination now starts to get large and the potential catchment
> > factor huge - use mis-spelt words as meta tagged key words to get
> > all the incorrect keyboard input!!!
>
> This is where I decide I don't want junk in my html. If a user
> misspells a search, I'd expect them to notice - Google will tell you,
> or you'll end up with an obviously depleted or weird search result.
>
> I'm not going to penalize everyone with a bucketload of html to try
> and drag in misspellings. Just a choice I can afford to make. If I
> was being paid wads of cash, I might change my mind.
>
> That, and don't you think it's possible that the search engines may
> one day penalize overloading of search tags? I've no idea what the
> limit might be. Your point below is quite valid.
>
> > *** Some of us are starting to think that the Google bots will pick
> > up "bookmarked" sites as links; the Google guys are very reluctant
> > to let people know how their bots really work - if they made the
> > information truly known we would all be using the info to fix our
> > listings!
>
> Guy
>
> _______________________________________________
> gloucs mailing list
> gloucs@mailman.lug.org.uk
> http://mailman.lug.org.uk/mailman/listinfo/gloucs
