#1
    Junior Member
    Devshed Newbie (0 - 499 posts)

    Join Date: Oct 2000
    Posts: 3
    Rep Power: 0

    HTML content extractor for XML conversion


    My employer (Museum Victoria) is beginning the process of upgrading its tens of thousands of web pages from HTML to XML. The benefits of this are numerous, but I assume everyone here knows them.

    The problem we currently face is how to get the data (content) out of the existing pages, leaving us with essentially plain text content that preserves only the basic formatting (heading levels, font emphasis, etc.). I have found a number of HTML strippers, and they are great at taking out the HTML tags alone, but they don't remove any non-body text (e.g. navigation text), and they don't preserve any of the body-text formatting.

    HTML to XML converters usually just convert the page to XHTML, which is not what we want. Other XML extractors don't work on more than one page, require substantial scripting, won't run on a Windows platform, or I don't know how to automate them to work on our many thousands of pages.

    Additionally, because of the many different designs of the pages across the museum's collection of sites, a great deal of time would be needed to create filters for HTML strippers or XML converters that will work on each site. Ideally, a utility intelligent enough to do most of the work and leave me to do the fine-tuning would be perfect!

    After many hours of searching the web, I'm starting to run out of ideas. Can anyone here help me with this challenge? I'm certain this is going to become a very widespread problem in the next year or so as content providers migrate to XML.

    Your help is greatly appreciated!

    Regards,
    Neil
#2
    Junior Member
    Devshed Newbie (0 - 499 posts)

    Join Date: Sep 2001
    Posts: 1
    Rep Power: 0

    NoteTab Pro


    Hi Neil

    My personal favorite - I started using the tool in 1997 - is NoteTab Pro. You can find it at http://www.notetab.com/. It is a very powerful text/HTML editor that includes an easy-to-learn, so-called "Clip Language". You can write macro commands in that language to process large numbers of files (the maximum size of ONE file is 2 gigabytes! - enough capacity, I assume).


    Do not expect NoteTab Pro to do miracles, though. There are no easy solutions to some of your requirements:

    a) "...but they don't remove any non-body text (e.g. navigation text)..."
    The critical point here is how to distinguish navigation elements (mostly strings enclosed in <a href=...> tags) from "normal" links within the body text.
    Write a macro to strip (search-and-replace) those navigation elements first.
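    As a sketch of that first macro (Python regular expressions standing in for Clip syntax, since the same search-and-replace idea carries over; the URL pattern that marks a link as "navigation" is a made-up example and will be site-specific):

```python
import re

# Hypothetical rule: treat links that point at top-level site sections
# as navigation and delete them, anchor text and all. Real pages will
# need per-site patterns - this is the "fine-tuning" part.
NAV_LINK = re.compile(
    r'<a\s+href="/(?:home|about|search|contact)[^"]*"[^>]*>.*?</a>',
    re.IGNORECASE | re.DOTALL,
)

def strip_nav_links(html):
    """Remove navigation links entirely; leave body links untouched."""
    return NAV_LINK.sub("", html)
```

    Body links such as <a href="story.html"> do not match the pattern, so they survive for later processing.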

    b) "...and they don't preserve any of the body-text formatting..."
    Write a macro to convert the elements you want to preserve into HTML entities.

    Then do the following further steps:
    c) Strip the HTML from your files.

    d) Reconvert the HTML entities to HTML tags and attributes.

    e) Convert the files to XML or XHTML with NoteTab Pro's built-in conversion filter.
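    Steps b) through d) - protect the tags you want, strip everything else, then restore - can be sketched like this (again Python standing in for Clip code; the whitelist of "basic formatting" tags is an assumption to adjust):

```python
import re

# Assumed "basic formatting" set - extend to suit your pages.
KEEP = ["h1", "h2", "h3", "b", "i", "em", "strong"]

def protect(html):
    # b) Turn the tags we want to keep into entity-encoded placeholders.
    # Note: any literal &lt;/&gt; already in the page would be restored
    # too - a real macro should use a more unusual placeholder.
    for tag in KEEP:
        html = re.sub(rf"<(/?{tag})>", r"&lt;\1&gt;", html,
                      flags=re.IGNORECASE)
    return html

def strip_tags(html):
    # c) Strip every remaining (real) HTML tag.
    return re.sub(r"<[^>]+>", "", html)

def restore(html):
    # d) Reconvert the placeholders back into tags.
    return html.replace("&lt;", "<").replace("&gt;", ">")

def convert(html):
    return restore(strip_tags(protect(html)))
```

    For example, convert('<table><tr><td><h1>Title</h1><p>Some <b>bold</b> text</p></td></tr></table>') yields '<h1>Title</h1>Some <b>bold</b> text', which is then ready for step e).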

    "Ideally, a utility intelligent enough to do most of the work and leave me to do the fine-tuning would be perfect!"
    The Clip Language - in my opinion - meets your requirements.

    cheers, tom
