Hi,
since I did not manage to post them on confluence
(dog slow) here are the meeting logs:
aaime: 0) What's up
***aaime is working on docs and back on wcs 1.1 (and having deep headaches with testing wfsv)
***jdeolive has been working on RC2 and playing guitar hero while waiting for builds and uploads
***arneke is battling Geo, getting Atlas ready to go again and trying to find time to work on jtilecache
***sfarber is about to start updatesequence support and then more ArcSDE raster formats (and perhaps some raster-branch poking)
***dwinslow is basically finished with the GZIP filter, still needs to be backported
***sfarber wonders if justin has seen http://www.gamerswithjobs.com/node/36460
sigq: Title: Gamers With Jobs | (at www.gamerswithjobs.com)
***groldan facing a deadline on another project
jdeolive: haha, sfarber: no i have not!!
***jdeolive is eagerly reading
***SE-Wilco2 preparing to confuse a report generator with maps
***sbenthall is trying to use versionedfeatures in Vespucci and writing an orton grant proposal when that is impossible
***aaime trembling every time sbenthall tries to touch wfsv
***SE-Wilco2 reverts aaime to last confident version
aaime: 1) Release
jdeolive: so its all ready to go pretty much
jdeolive: but what do we think about the wcs regression?
jdeolive: should we just put off?
aaime: I haven't investigated it but it should be trivial
jdeolive: well it is weird
aaime: what about releasing 1.6.0 out of rc2 + one little tiny change?
jdeolive: the code that gets the mime type is just commented out completely
jdeolive: well... i am tempted to just release because it's trivial so we can sneak it out next release
jdeolive: doing it now puts us a day behind
jdeolive: and i need to get back to paid work
aaime: That pretty much settles it
aaime: apparently everybody is on deadlines these days
jdeolive: that time of year i guess
aaime: Happy holidays to everybody
sbenthall: yeah, same to you
jdeolive: sooo... i will announce tomorrow
aaime: I guess this pretty much settles it
jdeolive: groldan: did you get a chance to test out kml?
groldan: yup
groldan: it worked well
jdeolive: sweet
groldan: with and without labels
groldan: and with time support
aaime: Ok, so announcement tomorrow
jdeolive: yup
groldan: so andrea you managed to make the windows installer?
aaime: already uploaded afaik
jdeolive: just attaching it to the release now
groldan: ah, sweet
groldan: so I'll have a run over it, as I've tested my own build when the gt svn was down
jdeolive: cool, its up now
aaime: groldan: http://sourceforge.net/project/showfiles.php?group_id=25086&package_id=38410&release_id=562389
sigq: Title: SourceForge.net: Files (at sourceforge.net)
groldan: gotcha
aaime: Ok, next topic?
aaime: 2) Testing
aaime: Hem, can we discuss 3) before this one
aaime: since I guess talking about testing will open the floodgates
jdeolive: good idea
aaime: 2) Community modules fast track
***jdeolive gets his raft
aaime has set the topic to: 0) What's up 1) Release 2) Community modules fast track 3) Testing
***dwinslow gets his wetsuit
aaime: So, we have those two modules
***aaime throws away hope and waits still for the flood to come
jdeolive: so i think its agreed that we can add these into version control? we have 4 +1's
jdeolive: ?
aaime: It seems so to me
aaime: Any objection?
sfarber: The +1 is for adding them to the community section, right?
aaime: yes
sfarber: +2 from me (that makes it 5)
aaime: ha ha
jdeolive: yup, we still have yet to decide on the entire process
aaime: Ok, but that would take some more discussion I guess
aaime: and a proposal document that integrates the various point of view
aaime: to ease up discussions
jdeolive: sounds good
aaime: Ok, next topic
aaime: 3) Testing
jdeolive: who wants to start?
aaime: Me with a provocative question
***jdeolive braces himself
aaime: Gabriel talked about the lack of real unit tests
aaime: yet GeoServer, as made up today
aaime: is really an integrator
aaime: does not have real library components inside
groldan: have some
groldan: but yeah
aaime: anything that's obviously easy to split up
aaime: it's in gt2
groldan: on a well designed test suite, most of them would be integration/system tests
aaime: So, I'm wondering how you make mock testing
aaime: for such a system (which is really about integration)
aaime: without having to rewrite half of geoserver in mock classes
jdeolive: aaime: i agree with you, it definitely lends itself more to integration testing
jdeolive: but what did you think about my suggestion?
jdeolive: about grabbing components directly from the container and unit testing them?
aaime: That's not strict unit testing either
aaime: Once you've integrated them with other stuff through dependency injection
aaime: you're doing a more limited version of integration testing no?
aaime: To make real unit testing one would have to take a class
aaime: and mock off all collaborators
jdeolive: hmmm... not sure i agree... i guess it depends on what the definition of unit test is
jdeolive: fair enough
aaime: Single class testing in isolation afaik
groldan: yup
aaime: everything beyond that is integration testing
jdeolive: that is what we achieve here... except that spring is doing the mock up
groldan: which doesnt mean you can't use the junit framework to write integration tests, actually that's clever
jdeolive: with the real components... so yeah, i can see the point
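The "mock off all collaborators" style aaime describes can be sketched without any mocking framework, e.g. with a `java.lang.reflect.Proxy`; the `Renderer`/`MapService` names below are invented for illustration and are not real GeoServer classes:

```java
import java.lang.reflect.Proxy;
import java.util.ArrayList;
import java.util.List;

public class MockSketch {
    // Hypothetical collaborator interface (not a real GeoServer type).
    interface Renderer {
        String render(String layer);
    }

    // Class under test: its only dependency is the Renderer we mock out.
    static class MapService {
        private final Renderer renderer;
        MapService(Renderer renderer) { this.renderer = renderer; }
        String getMap(String layer) { return "map:" + renderer.render(layer); }
    }

    public static void main(String[] args) {
        final List<String> calls = new ArrayList<>();
        // Dynamic proxy standing in for the real collaborator: it records
        // the interaction and returns a canned value.
        Renderer mock = (Renderer) Proxy.newProxyInstance(
                Renderer.class.getClassLoader(),
                new Class<?>[] { Renderer.class },
                (proxy, method, methodArgs) -> {
                    calls.add(method.getName() + ":" + methodArgs[0]);
                    return "fake";
                });
        String result = new MapService(mock).getMap("roads");
        // The test asserts on both the result and the recorded interaction,
        // with no real rendering machinery involved.
        if (!"map:fake".equals(result)) throw new AssertionError(result);
        if (!"render:roads".equals(calls.get(0))) throw new AssertionError(calls);
        System.out.println("isolated unit test passed");
    }
}
```

Frameworks like EasyMock or mockobjects automate exactly this proxy-and-record pattern; the trade-off aaime raises is that GeoServer would need many such stand-ins to test every class this way.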
aaime: Hum, may I be plain and a little silly?
groldan: not everything, with integration testing you ensure your classes play nice together
aaime: Up until a few months ago we did not have any kind of testing beyond cite
groldan: then comes functional/system testing
funky_c: aaime: I came in halfway through, but would like to chime in: Do we currently have mocked up tests running in JUnit?
groldan: where you assert the system does the right thing
groldan: (before you were asserting the components do things right, i.e. as you expect)
aaime: All tests are written using junit
funky_c: Are you guys using any mocking framework?
aaime: Partially, but
jdeolive: somewhat
aaime: since everything was written before testing was available
aaime: making it really unit testable would be a massive amount of work
aaime: we would have to build tens or hundreds of mock facilities
jdeolive: well lets be realistic
groldan: more or less
jdeolive: would actually meeting the definition of "unit testing" be of benefit
jdeolive: no
groldan: first, you have to decide which stuff deserves the effort
jdeolive: would actually testing components in isolation be beneficial... yes
vheurteaux has left the room (quit: ).
funky_c: What mocking framework are you guys currently using?
funky_c: My impression of mocking was it was easier than that.
groldan: i.e. DefaultJAIMapProducer does, dumb beans dont
jdeolive: funky_c: we use mockobjects for mocking up http servlet requests
jdeolive: and then just some spring support classes for creating the application context
groldan: but yeah andrea, my question was more along the lines of whether we want to take the time to plan on splitting up test suites into unit/integration/functional/performance
groldan: and to decide what to do about it
aaime: What is the benefit of the split?
aaime: at the moment we have unit and integration, I don't see how reorganizing packages would improve things
aaime: functional is cite tests
aaime: performance is not there
groldan: well, I can think of a couple, but surely literature will bring more
groldan: yes, functional is cite
groldan: I guess we wanted to augment that though?
aaime: if you look at our tests now
groldan: like we have more stuff than what cite tests for
aaime: many are really equivalents of cite calls
aaime: so in fact we also have functional in there
jdeolive: i would like to see a split between "mock" (integration) and "pseudo unit" (unit)
aaime: Base class wise it seems to make sense
aaime: not sure we want physical separation of them?
groldan: then, along the lines of performance there are load, stress, etc; they somehow overlap, but not completely
aaime: can we close the unit/integration/functional part before talking about performance?
groldan: thought we were talking about testing
groldan: if we don't want perf test its ok (actually its not)
aaime: Sigh, I want to split them because performance does not have established tools and it's a much longer topic
aaime: And to reduce confusion
groldan: ok
aaime: So far I've seen generic statements about what is what
groldan: true
aaime: but I still don't see any concrete proposals?
aaime: What do you want to do with that classification?
groldan: that's because my question was to know if we would be up to the task, then I'd like to make a proposal
aaime: To decide if we're up to the task we need to ask if Chris gives us funding to do whatever it is you're proposing
aaime: So better have the proposal first?
jdeolive: well i dont think its too much work to get the split
groldan: but as said, I would first need to learn more on the topics; each kind of test has its purpose, and since they're all already (quite) well defined in standard computer science literature it does not make sense to reinvent the wheel nor to misuse the concepts
jdeolive: if you agree that "unit" testing with the container doing the wiring of components is still useful unit testing
aaime: groldan, I would like to see a proposal that goes beyond the academic distinctions
groldan: it _is_ useful, so far is the only kind we have
aaime: that is, something that talks about what you want to do
vheurteaux [n=vheurtea@anonymised.com] has entered the room.
jdeolive: so question
jdeolive: what more use would a collection of mock geoserver objects be for unit testing?
groldan: like the properties datastore "mocking up" was an attempt to isolate stuff too; relying on it functioning well helps build confidence in our tests
groldan: jdeolive: note I didn't propose to write a huge collection of mock objects
jdeolive: sorry, i was not implying that you did
aaime: groldan, so far I'm entirely missing what you're proposing. I only see theoretical discussion
jdeolive: maybe i am missing something
groldan: I'd like to find a balance between effort and reward though
aaime: can we put our feet back onto the ground and try to understand what we're talking about?
groldan: aaime: so far its theoretical discussion
groldan: imho, we're talking about if we agree we need better tests, and would like to have a proposal for it
aaime: groldan, I know I want more tests
aaime: I know I want them to run faster
aaime: I know that I want perf tests too
aaime: but I don't understand what these "better tests" are
groldan: cool, that's something i'm finding useful in this separation of concerns
groldan: like to continuously be able to run unit tests in just seconds
groldan: and run integration/system say before commit
groldan: etc
SE-Wilco2: Are these self-run tests, or tests run by a script which is separate from a GeoServer server?
aaime: achieving the first goal seems to be a lot of work
jdeolive: actually it might not be
aaime: SE-Wilco2, so far it's part of the build process
groldan: say "better tests" are those that are: easier to write, easier to maintain, that provide a specific kind of assurance instead of mixing stuff,
jdeolive: although i posted my idea on email and that does not seem to be what you guys are talking about
aaime: jdeolive, I have nothing against that idea
aaime: seems a good one to me
aaime: but I feel what gabriel is trying to propose is different
aaime: but the fact he's starting from academic definitions worries me because so far my experience with academic stuff is that it's better to leave it in textbooks
groldan: the email idea sounds good for the short term, and it's actually more to the point (pragmatic) than what I'm saying
groldan: as what I'm saying is more a long term goal
aaime: because reality tends to be more complex, and forcing it into a simplified vision of the world gets in the way
groldan: lol
aaime: Anyways I would be happy to be proven wrong
aaime: I just need more of a layman's description to really understand what we're talking about
aaime: what are the resources required
aaime: what are the advantages
aaime: I can't say if I would like that or not if I cannot do a cost/benefit analysis
groldan: agreed
aaime: I'm very pragmatic, making effort to just have something that looks better is not my thing
aaime: (looking better in this case means adhering to whatever theoretical definitions we want to adhere to)
groldan: so much so that me asking this is to see if we agree even on investing the time on doing a deeper "testing requirements" analysis
aaime: I think the topic is interesting
groldan: andrea, the whole point beyond the theoretical definitions is to get benefit of a better separation of concerns
aaime: and I would love to participate in it if it requires say 2 hours a week
aaime: other than that it has to be done on paid time
aaime: groldan, that's where reality kicks in
aaime: you can make these kind of separations if your system is built to allow them
aaime: but geoserver was built with zero testing around
groldan: yeah, though reality sometimes means things are so messed up that you have to do huge refactorings just to get code more test friendly
aaime: so functional testing is obvious, integration is hard, unit is close to impossible
groldan: if that's the case, doing so is yet another benefit imho
aaime: right groldan, that means someone has to pay for doing that job
aaime: The bottom line of my point of view is that I don't own my working time
groldan: yup
aaime: my employer does
aaime: so whatever requires more than a few hours per weekend is bound to be moved to work time and be
aaime: subjected to a cost/benefit analysis that you can defend with your employer
groldan: always had that in mind, yes
groldan: so much so that it's been like three years that we've been talking about improving this kind of stuff
***groldan hopes jdeolive remembers those talks
jdeolive: i do
aaime: groldan, which means reality proved that you can only move by little steps at a time
jdeolive: but we have gone miles ahead of where we were three years ago just by adding one type of testing
jdeolive: imho anyways
groldan: so a bunch of the work was done by justin
groldan: already with his improvements
groldan: yeah
aaime: agreed, testing in geoserver is a massive improvement imho
groldan: cool, things are clear I guess
groldan: we should move
jdeolive: and i think that we can go a bit further with gabriels suggestion
jdeolive: actually more than a bit
groldan: I'll try to get the time to go further on the practical stuff from the theoretical limbo
aaime: Agreed, but again, I would like to see what's the proposed next step we can fit between the other higher priority stuff we're working on
aaime: jdeolive suggestion seems a nice one
aaime: providing a step in the direction of dividing up unit and integration
groldan: if we ever have a plan we can do it a step at a time while doing higher priority stuff I think it would be cool
jdeolive: i think that gives us the benefit of what gabriel brought up... without a ton of cost
groldan: yeah, sounds reasonable
aaime: Ok, so we move this part of the topic on mail and discuss with Gabriel the next steps?
groldan: I'd also take some time to see where the easymock way of mocking up stuff could save time
SE-Wilco2: Remember to reset Topic at end of meeting.
aaime: Sure, I'm curious to see it in action in GeoServer or GeoTools as well
aaime: Do you still want to talk briefly about perf testing?
groldan: hmmm sure, though not sure I have much to say
aaime: I do
groldan: cool
aaime: perf testing is a problematic topic
dwinslow: aaime has something to say about performance?
aaime: stuff like junitperf, testing that something executes within a given amount of time
aaime: does not seem very useful
aaime: the time is machine dependent
aaime: and the real fact is
jdeolive: aaime: i actually had another idea about the use of junitperf
groldan: agreed
aaime: we're not interested in seeing if an obnoxious deadline is not met
aaime: we want to see how the speed compares to yesterday
aaime: one month ago
aaime: one year ago
aaime: so it's more a perf reporting with warnings
groldan: didn't we have some JMeter project?
arneke: i've given it a bit more thought... how about we stop the usual stuff on Artois at night,
aaime: it's not the kind of build that you want to fail
SE-Wilco2: When testing "within a given amount of time" consider using a ratio to compare to another test. Seeing if test B runs in about twice the time as test A (if that is expected).
arneke: and have a fast machine run tilecache seeding against it?
aaime: arneke, to have reliable comparison the machine should be completely isolated
SE-Wilco2: Consider whether performance is a "test" (pass/fail) or only a "measurement" (how good it is).
aaime: SE-Wilco, hard to do that one too since different tests exercise different parts of the computer
aaime: (disks, memory, cpu, graphics card)
aaime: Eclipse has a good perf testing approach
aaime: one that I like at least
jdeolive: aaime: do you know what tools they use?
aaime: they make graphs comparing current and previous releases
sbenthall: (dwinslow, arneke: artois is not responding)
dwinslow: sbenthall: I'll take a look
aaime: jdeolive, no, I think it's home grown though
aaime: groldan, I've found jmeter good for perf assessment and interactive measurement
dwinslow: sbenthall: seems to be hung on some request
aaime: but not very good for automated perf testing
SE-Wilco2: For perf testing, start by just making it take measurements and not pass/fail (except if the result is tested rather than only the time).
jdeolive: which is where my idea comes in
sbenthall: dwinslow: orly?
***jdeolive will wait until the floor settles
arneke: dwinslow, sbenthall: lets go to #topp
aaime: jdeolive, I'm done
jdeolive: cool
jdeolive: so
jdeolive: here it is, and first let me ask you a question
jdeolive: have you looked at the junitperf api?
aaime: just at a tutorial
jdeolive: ok... so am i correct in thinking that it runs as a regular unit test?
groldan: same here, and got the same concern as andrea
jdeolive: like you could run it in maven?
groldan: yes, they're decorators
aaime: yes, you're correct afaik
groldan: that measure the time a unit test takes
groldan: where that unit test should be an integration test at least
jdeolive: ok... i imagine there has to be something in the api that measures how long the test took right? and a way to capture it?
aaime: probably
groldan: that's why it doesn't make a lot of sense for geoserver
jdeolive: so i don't think the value is to ensure that something took less than x
aaime: it depends on whether they exposed it or not
jdeolive: just that it took x
jdeolive: and to save x to a report for publishing
groldan: its ok if you need to ensure a given algorithm performs well though
aaime: jdeolive, I see your point, seems a good idea
aaime: we set up a real data dir for testing various configurations
jdeolive: we would basically use it as a way to easily get timing utilities into the tests
aaime: and then run perf tests on it and report out the results
jdeolive: exactly
aaime: it would probably work a lot better than jmeter
jdeolive: confluence has some graphing macros we might be able to use as well
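jdeolive's idea here, capturing how long each test took and saving it for a published report rather than failing on a machine-dependent deadline, can be approximated without junitperf at all. A rough sketch with made-up names (not the actual plan or any real GeoServer test):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PerfReportSketch {
    // Timings collected per test name; a real build would write these
    // out (e.g. for graphing on confluence) instead of printing them.
    static final Map<String, Long> timingsMs = new LinkedHashMap<>();

    // Decorator in the junitperf spirit: run the test and record elapsed
    // time, but never fail just because a wall-clock deadline was missed.
    static void timed(String name, Runnable test) {
        long start = System.nanoTime();
        test.run();
        timingsMs.put(name, (System.nanoTime() - start) / 1_000_000);
    }

    public static void main(String[] args) {
        timed("fakeGmlEncode", () -> {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 100_000; i++) {
                sb.append("<gml:pos>").append(i).append("</gml:pos>");
            }
        });
        for (Map.Entry<String, Long> e : timingsMs.entrySet()) {
            System.out.println(e.getKey() + ": " + e.getValue() + " ms");
        }
    }
}
```

Comparing these numbers run over run, Eclipse-style, against previous releases is what turns the raw "x" into the "perf reporting with warnings" aaime describes.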
groldan: concern is whether we're testing geotools datastore performance or geoserver stuff
jdeolive: groldan: that is the point
aaime: groldan, I think that's like the rest of testing
jdeolive: the main goal (in my mind) is that we track when something that's committed makes the (entire) system slower
jdeolive: that includes geotools
aaime: our first concern is not what part is fast or slow, but whether geoserver as a whole gets faster
aaime: (or slower)
groldan: hard to do though
groldan: say I mess up arcsde performance
groldan: who will notice until saul complains about his production server?
aaime: you can catch it if you have an arcsde related test in your suite
sfarber: don't do that gabriel!
aaime: if you don't pity
groldan: will keep trying..
aaime: but what can we do about it?
groldan: I have an idea
aaime: if we don't have an arcsde to be used only for perf testing
jdeolive: groldan: we need a server farm, with different datastore backends with geoserver
aaime: in a controlled environment
jdeolive: and run the suite against all them
aaime: right
jdeolive: we can also charge people a 2 dollar entry, 1 dollar for kids
groldan: do what justin suggests, but find a way to test "how much it takes to encode GML"
groldan: and "how much it takes to encode GML from postgis"
groldan: and "how much it takes from shapefile"
aaime: Well, if you take our existing functional testing approach
aaime: and measure its time against different datastores
aaime: that's exactly what you get
groldan: except for the pure encoding without taking into account the datastore?
jdeolive: again i think we need to think baby steps here guys
jdeolive: having any performance testing will be better than nothing
groldan: say encoding is O(n^3) (hope its not)
jdeolive: worrying about measuring individual parts of the system might just be overkill and not that useful in the grand scheme
groldan: it _might_ be useful when you notice a slowdown?
jdeolive: agreed
aaime: How is it likely
jdeolive: but so might the commit log
groldan: like to go straight to the datastore or not
aaime: that you change both the datastore and the encoder at the same time
groldan: ha, true
aaime: for more than one datastore?
groldan: ok, I see we can keep like this forever
aaime: It's just a problem of being pragmatic
aaime: trying to make use of the scarce resources you have
groldan: how do you know that much of my economy?
aaime: In the last year I have very rarely had time for anything other than creating new features, paid work, bug fixing
aaime: lol
aaime: so for experience I think whatever sounds like infrastructure gets low priority
aaime: and so it has to be split into small steps
***jdeolive agrees with aaime there for sure
aaime: so that you can take each one at a time
aaime: and each one provides some benefit
groldan: hey, I agree too
jdeolive: changes in infrastructure are also usually met with adversity
groldan: so jmeter is not up to the (automated) task
groldan: but it is for running manually
aaime: yes
aaime: That's why when you say we need a way to measure gml encoding alone
groldan: so we can include that in the release process
aaime: we say "hold your horses"
aaime: because it would require more work
groldan: at least have a per release report
fatgoose [n=fg@anonymised.com] has entered the room.
aaime: Again, why don't we start with junitperf and see if we need more?
aaime: catching slowdowns as they happen is important
jdeolive: agreed
jdeolive: i think that should be a first step... which is definitely attainable without a ton of work
aaime: knowing that in the last three months perf went down 50% is not as useful
groldan: as long as it's done on that isolated and not-doing-anything-else machine
jdeolive: groldan, agreed
aaime: right, that's what we're trying to get from topp
jdeolive: if we actually had reliable servers all these ideas might actually seem easier
aaime: an old pc that does nothing but perf tests
SE-Wilco2: Consider running tests twice back-to-back, expecting second one to run faster due to caching.
aaime: or an hour of quiet on an otherwise busy machine
groldan: yeah I usually discard the first run
aaime: I always do as well
funky_c: Quick plug: Someone I respect in the performance testing world just released a new perf testing book: http://www.testingreflections.com/node/view/6345 It's available for free as a PDF, and may give good insight to later discussions.
sigq: Title: Performance Testing Guidance for Web Applications book | testingReflections.com (at www.testingreflections.com)
groldan: thanks funky_c
SE-Wilco2: Actually, I meant run each individual test twice in a row, not the entire script. So as to check expected caching effects.
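SE-Wilco2's run-each-test-twice suggestion and the discard-the-first-run habit groldan and aaime mention boil down to the same warm-up pattern; a minimal sketch (illustrative only, names invented):

```java
public class WarmupSketch {
    // Run the same test twice back to back: the first run warms caches
    // and the JIT and is discarded, the second run is the measurement.
    static long measureDiscardingWarmup(Runnable test) {
        test.run(); // warm-up run, timing thrown away
        long start = System.nanoTime();
        test.run(); // measured run
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        long ms = measureDiscardingWarmup(() -> {
            long sum = 0;
            for (int i = 0; i < 1_000_000; i++) sum += i;
        });
        System.out.println("second run took " + ms + " ms");
    }
}
```

Keeping both timings instead of discarding the first would also let a report check SE-Wilco2's expectation that the second run is faster due to caching.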
***dwinslow bookmarks
sfarber has left the room.
aaime: So, to recap, we'll have a look at junitperf and see if we can use it as a perf reporting engine
aaime: (or else we can build our own decorators, which should not be that difficult?)