Exporting Friends / Shared Journal comments and journals
[Apr. 30th, 2004|11:27 am]
LiveJournal Client Discussions
Hello, I'm working on developing a client catering to people who have multiple journals. (For example, people who use the LJ system for role-playing, and who thus have journals for their various characters.) One of the requirements, if this client is to be of any use at all, is that it allow people to seamlessly view the journals of all of their friends, subdivided by which account each journal is friended to. Another is that it must support collapsible comment trees.
First of all, is it possible to fetch entries from the journal of someone other than yourself via the getevents mode? Or is there some other mode that would be more appropriate? Getevents appears to only allow you to fetch your own journal. (Currently it looks like my wisest choice is to simply use the /data/atom format.)
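To make the /data/atom route concrete, here is a minimal sketch of pulling another user's public entries from their feed. The URL pattern is taken from the post; the Atom namespace shown is the current one, and older LJ feeds may use the Atom 0.3 namespace instead, so treat both as assumptions to verify.

```python
# Sketch, not a finished client: fetch a journal's /data/atom feed and
# extract (title, link) pairs for each entry.
import urllib.request
import xml.etree.ElementTree as ET

# Assumed namespace; older feeds may use "http://purl.org/atom/ns#" instead.
ATOM_NS = "{http://www.w3.org/2005/Atom}"

def parse_atom_entries(xml_text):
    """Return a list of (title, link href) tuples from an Atom document."""
    root = ET.fromstring(xml_text)
    entries = []
    for entry in root.findall(ATOM_NS + "entry"):
        title = entry.findtext(ATOM_NS + "title", default="")
        link_el = entry.find(ATOM_NS + "link")
        href = link_el.get("href") if link_el is not None else ""
        entries.append((title, href))
    return entries

def fetch_atom_entries(username):
    """Fetch and parse a user's feed (URL pattern assumed from the post)."""
    url = "http://www.livejournal.com/users/%s/data/atom" % username
    with urllib.request.urlopen(url) as resp:
        return parse_atom_entries(resp.read().decode("utf-8"))
```

Note that this only gets you the entries themselves; as discussed below, the feed does not appear to carry comment information at all.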
Secondly, having fetched the journal entries of those friends, and the entries of any shared journals, is there a mechanism for exporting the comments on those friends' journals? Currently the /data/rss and /data/atom formats don't even appear to tell you whether comments are present on a given entry. And the comment-export guidelines clearly state that you can only export your own comments.
Well... there is the downside that the only mechanism I will have available for detecting new comments is HTML-scraping the journal or the individual entries... which might marginally increase the traffic generated by this script (especially if I let it scrape every entry. *shudder*). In any case, the script will be able to query more entries, more rapidly, than most browsers, so even if it doesn't increase total traffic, the traffic it creates will be more "dense" than it would be otherwise.
Does "check friends" notify you if there are new comments to existing posts, or modifications to existing posts? If so, that would vastly reduce the poling neccesary. Or does it just notify you when there are brand new posts?
Hmm... Maybe I can put in "watch" tags, so that the script will only check entries which the user has designated as important.
You can scrape the friends page when checkfriends tells you to. The user will need to have set the option that adds the "&nc=" to the links to comments. Then you can parse those out and get the number of comments on each entry.
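A rough sketch of that parsing step, assuming the "&nc=" option is enabled as described above. The exact shape of the friends-page HTML varies by style, so the regex and the sample markup here are illustrative assumptions, not a definitive scraper.

```python
# Pull per-entry comment counts out of scraped friends-page HTML, relying
# on the user having enabled the "&nc=<count>" suffix on comment links.
import re

# Match href="...<entry-url>?...nc=<count>..." (link shape assumed).
NC_LINK = re.compile(r'href="([^"]+?)\?(?:[^"]*&)?nc=(\d+)')

def comment_counts(html):
    """Map each entry's comment URL to its reported comment count."""
    return {url: int(n) for url, n in NC_LINK.findall(html)}
```

Comparing each count against the count from the previous scrape tells you which entries have new comments without fetching every entry page.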
Alternately, if you want to make it easier on everyone, you can write a custom journal style that returns pretty much nothing but the data you want (including the comment count) in XML. I did this once; it's not hard. Then you add a query to the URL to fetch the friends page, telling it to use that style (I think it's "&usestyle=nnnnn" where nnnnn is the ID number of the style). The drawback is that only paid users can fetch pages using arbitrary styles, so your tool won't work for free users.
Parsing the comments page could be hard, unless there is a way now to force the comments page to display using an arbitrary S2 style. In that case you can pull the same trick as above, defining a style that returns all the data in nice easy-to-parse XML, and adding a query to the URL to force that style.
I've thought about doing this; if you do so, please give back to the community by at least posting info about the ID number(s) of your style(s) and a schema of the XML they generate, so that other tools can use them!
I may experiment with this for testing purposes; however, it will not be useful for my main system.
The scenario I am dealing with is one in which a given user may have up to ten different journals, only one of which is actually a paid journal. Each of those journals may have mutually overlapping (yet nonidentical) friends lists, and each of those journals may be in different friends groups on the journals they are friends of and friended by. Therefore, to check friends it is actually necessary to log in as each of those journals... and thus, even if you have a paid account, custom styling will only reduce the checking for a single journal at most.