Declan commented on David's blog post, which was itself a response to an article on JSFCentral.
Still with me?
Anyway, he claimed (or joked?) that the application described on JSFCentral needed 30 servers.
That got me thinking: we migrated a pretty complex application from Domino to JEE last year, and of course management thought that things would get a lot simpler and cheaper in the new environment.
Well, if counting servers is any indication...
Listing only the servers for the SaaS website (so excluding internal software, webservices and other stuff), we started with the following Linux servers:
2 Apache
1 SQL Logging server
1 Tomcat search server
1 SQL Search server (plus 1 backup)
2 clustered Domino Webservers
1 Domino webserver for 3rd party connections
1 Domino Hub
1 Domino Index
1 Domino ‘Edit’ server (backoffice/maintenance/agents)
1 Windows server for PDF processing (OCR server)
TOTAL: 12
And we now have:
2 Apache
1 SQL Logging server
2 Tomcat Webservers
1 Tomcat Batch server (agents)
3 Elasticsearch search servers
2 clustered SQL servers
1 Windows server for PDF processing (OCR server)
TOTAL: 12
Now as you can see, this environment was already partly JEE-ified, so there were already some additional servers, which taints the comparison. If anything, the overall number of servers probably increased rather than decreased. Plus a change in the hardware specs, of course: we used to run 32-bit with 8GB of memory, and now 64-bit with 32GB. But who's counting... ;)
 24 May 2015  comments (0)
Vince's
Ramblings.
From my inbox:
Vince,
It’s hard to believe that it has been 20 years since THE VIEW first arrived into an exciting Lotus market! During that time we’ve had the pleasure of doing business with thousands of subscribers and attendees of our Admin and Developer conferences. Throughout the years, we worked to deliver trusted and valued information to help you do your job better. We remain humbled by the favorable response to our products over the years.
It's with a sad farewell that we announce we are no longer supporting a subscription model and closing eview.com. As a subscriber, you’ll retain your access to THE VIEW archives through June 30, 2015.
We’re heartened that the community still has a supportive group of content contributors through blogs, forums and user groups. Our parent company, Wellesley Information Services, will continue to manage the SocialBiz User Group, and we encourage you to visit us there, find a local user group, access technical webcasts and ConnectED presentations, and read and contribute technical tips and tricks on the blogs.
Other technical resources we recommend include:
OpenNTF
developerWorks
Notes in 9 (XPages)
Connections101.net
Planet Lotus
VIEW virtual training
The VIEW Anthologies
We hope to see you on SocialBizUG.org and at various IBM events. Until next time...
Regards,
Celia Hamilton
THE VIEW
Shocking news for me, didn't see this coming.
I know the Lotus market 'has been better', but I never considered that an institution like THE VIEW, one of my main sources of information on Lotus matters for the past 18 or so years, would stop so abruptly.
All the best to you guys and thanks for everything!
 5 May 2015  comments (1)
Now here is something I never did before.
I have a Neato vacuum cleaning 'robot' and after a year of 'hard work' it started to show errors on the display. The Neato website showed me how to 'reboot' the vacuum cleaner, but there was also a section about software updates.... Software updates for a vacuum cleaner. Quite logical if you think about it, but for some reason it never occurred to me that you could actually upgrade the software of a vacuum cleaner.
So I thought what the heck, just try it, and indeed after 3 attempts I managed to upgrade the software from 3.2.something to 3.4.something. Do I see a difference? Yes, I noticed some considerable changes actually. The new software is claimed to have a better battery management system and so on, but to me the most notable change is that it is a lot more aggressive while vacuuming. It relies less on its optical sensors and bumps harder and more often into obstacles.
The good news is that you can change the settings by adjusting the software yourself. You may think that as a software developer I would not be able to resist the challenge, but I decided to let the vacuum cleaner's software do its thing unless it really starts to annoy me.
 6 February 2015  comments (1)
Ok, now this was a very simple problem (in hindsight), but it took me quite some time to figure out:
We use Elasticsearch to search through pdf's and the Elasticsearch Attachment Mapper to index the pdf's.
However, we have over 500k documents, and we noticed that after one or two days Elasticsearch tended to spike CPU use to 100% due to the attachment mapper. Because even Elasticsearch experts could not find the problem (by the way, try to find an expert first ;)), we decided to use Tika (the library used by the attachment mapper) directly.
It seemed pretty straightforward: add tika-core to the pom file, change 2 lines of code and away you go....
At least, so we thought, because all our test code worked flawlessly. However, running it in Tomcat did not extract any text from the pdf's.
After lots of debugging, we noticed that the parsers called from the tests were different from the ones invoked by the server. Further investigation revealed that my Java IDE, IntelliJ, contained the tika-parsers library as well as the tika-core library.
So after checking the documentation (yeah, probably a bit late) I found a comment that the tika-core library can identify, but not parse, the contents of a document.
After exchanging tika-core for the tika-parsers library in the pom.xml, I got an error that suggested library incompatibilities:
Handler processing failed; nested exception is java.lang.VerifyError: class net.sf.cglib.core.DebuggingClassWriter overrides final method visit.(IILjava/lang/String;Ljava/lang/String;Ljava/lang/String;[Ljava/lang/String;)V
This error took some more research, but after a lot of coffee and even more strong words I found that many of the libraries in tika-parsers were already in our own pom, and that the culprit was the asm library.
So here is what worked for us; depending on your pom you may need to exclude other libraries as well.
<dependency>
    <groupId>org.apache.tika</groupId>
    <artifactId>tika-parsers</artifactId>
    <version>1.7</version>
    <exclusions>
        <exclusion>
            <groupId>org.ow2.asm</groupId>
            <artifactId>asm-debug-all</artifactId>
        </exclusion>
    </exclusions>
</dependency>
 3 February 2015  comments (0)
Now that brought some memories back for me.
A few years after Bruce arranged to host openntf.org at his work, he bought a server and hosted it at PSC in 2002.
After a couple more years OpenNTF really started to attract attention and traffic spiked big time. We started to worry about up-time and data backup so I bought a Dell PowerEdge 1800 in 2006 to act as the European backup and failover server for the American based OpenNTF site.
After the production site was transferred to Prominic (where it still runs today; thanks to Justin Hill and the other guys at Prominic), it continued to act as a development server for years. But today, sadly, it started to squeak and squeal, and after a couple of restart attempts it died completely.
Not bad actually; 9 years is way over the average lifespan for a server, even for one with every failsafe option available.
Anyway, it got me thinking about the early days of OpenNTF and how simple 'development life' was back then (for me anyway).
I am working on major JEE projects with large multidisciplinary teams nowadays, a long way from the 'one person, one project a week' Domino development environments I was involved in back in those days.
It feels like the death of the old server symbolises the ending of my involvement with the Lotus/IBM/Domino community as well. I am still involved in several Notes/Domino/Traveler/Sametime upgrades and such, but development work on that platform seems hard to find.
Some ISVs are still developing their commercial applications, but I don't see much development at the end-user level. It could be that I am not in contact with the current Domino development scene anymore, but that is how I experience it.
Sad, because the Domino community seemed so much more involved in the platform than the Java world is. Yes, there are all kinds of venues and conferences etcetera, but I do not see the same involvement as in the Lotus/Notes/Domino scene.
A sad thought at the start of the Lotusphere Connect ConnectED conference, but I'm sure it will work out eventually.
 25 January 2015  comments (3)
While testing our new system we came across this very strange error message:
org.joda.time.IllegalInstantException: Cannot parse "1940-05-16": Illegal instant due to time zone offset transition (Europe/Amsterdam)
So we checked, and it appears there was a problem with date/times on that date: the time was changed from UTC+00:20 to UTC+1, and Joda does not know what to do with that date.
It doesn't really seem like a big deal, a problem with only one date. But since that exact date is the birthdate of Arie Boer, CEO of AHOLD (one of the largest Dutch supermarket chains), this 'small' problem has a high impact :)
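For what it's worth, the usual way around this class of error is to parse pure dates without applying a time zone at all, so the offset transition is never consulted. A minimal sketch, using java.time for illustration (Joda-Time's own LocalDate behaves the same way):

```java
import java.time.LocalDate;

public class GapDateDemo {
    public static void main(String[] args) {
        // A LocalDate carries no time zone, so offset transitions
        // (like Amsterdam's 1940 switch) can never make it illegal.
        LocalDate birthDate = LocalDate.parse("1940-05-16");
        System.out.println(birthDate); // 1940-05-16
    }
}
```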
 29 August 2014  comments (0)
We are in the last stages of our Migration project from Domino to JEE and one of the hardest parts was to find a good search engine for full text searching.
We decided on Elasticsearch, a very fast and powerful search engine based on Lucene.
The problem we encountered when exporting the pdf's from Domino on Linux was filenames with diacritics. Domino's embeddedObject.extractFile functionality does not work well with diacritics, so I decided to use the InputStream instead:
InputStream inputStream = embeddedObject.getInputStream();
OutputStream outputStream = new FileOutputStream(new File(embeddedObject.getSource()));
byte[] buffer = new byte[1024];
int read;
while ((read = inputStream.read(buffer)) != -1) {
    outputStream.write(buffer, 0, read);
}
inputStream.close();
outputStream.close();
Note: You may have to change the locale settings for the Domino system (or the user Domino runs under) to support UTF-8.
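As an aside, on Java 7 and later the copy loop above can be collapsed into a single java.nio call. A sketch with illustrative names (embeddedObject is Domino-specific, so a plain stream stands in for it here):

```java
import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;

public class AttachmentCopy {
    // Same job as the read/write loop above: stream the attachment
    // bytes to a file. REPLACE_EXISTING overwrites an earlier export.
    static long copy(InputStream in, File target) throws IOException {
        return Files.copy(in, target.toPath(), StandardCopyOption.REPLACE_EXISTING);
    }

    public static void main(String[] args) throws IOException {
        File target = File.createTempFile("attachment", ".bin");
        InputStream in = new ByteArrayInputStream("hello".getBytes("UTF-8"));
        System.out.println(copy(in, target) + " bytes copied"); // 5 bytes copied
        target.delete();
    }
}
```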
 25 June 2014  comments (0)
An update in the series on the migration from Domino to JEE. We have all been working very hard on this project; we had some setbacks, lots of personnel changes (good and bad) and lots of stress.
We planned to migrate the old site page by page, but we got to a point where we have to migrate everything related to customer and user settings at once, and that means it will be more like a big bang than an evolution.
Anyway, front-end business: we use Spring with a Thymeleaf frontend because I felt that was easier to develop in than the more traditional JSP. Because we had to start from scratch (the HTML of the current site was originally developed around 2002, with lots of pass-thru HTML and computed text), we decided to make it HTML 5 and see how to fix that for IE 8/10 later ;).
Anyway, I thought I had pretty good knowledge of HTML 5 and responsive design, but we got a guy who proved me wrong. He literally started from scratch: all the pages I had copied from production and cleaned up were dismissed and replaced by HTML 5 code with under half the lines of code I used. One thing he is very specific about is that the site has to work with NO JavaScript at all!
Yes, some things like typeahead may not work, but the site should function 100% without it. It was a culture shock for me, because with all the jQuery and Dojo around it is hard to imagine a site without JavaScript. But I felt he had a point, so I went back to the drawing board and adjusted the back-end to accommodate the new requirements. And you know what? It's not half as hard as it seems using HTML 5.
I must be a real geek, because looking at the new HTML pages really makes me happy: so clean, so easy to maintain, so powerful, especially in combination with Thymeleaf and the Spring expression language (SpEL).
 1 April 2014  comments (0)
I'm glad for IBM, but sometimes I feel like I'm subsidising all those international giants (Volkswagen, Ikea, Gucci, Pirelli, Prada, Fujitsu-Siemens, and U2) with the 52% tax on my income.
IBM tax rate hits 20-year low with help of Dutch haven
 5 February 2014  comments (1)
Not that one would ever need this, but it works.
And the MBP is not even breaking a sweat. Never mind the 'not responding'; Parallels does not always communicate correctly with the OS.
 30 October 2013  comments (0)
Ok, we had our first release last week, we have finally migrated some parts of a very complex Domino infrastructure to JEE. This is only Phase 1 we are talking about, so lots more to come. But at least we can now get a feeling of how it runs under pressure and whether or not we have to adjust our calculations for the other 5!!! phases of the migration from Domino to JEE project.
So far it looks like we are still within budget. However, I get the feeling we are moving more and more functionality forward. All small things, mind you, but this could add up to a significant amount.
The good part (for the project budget) is that even after the last budgeted phase there will still be a Domino server around to take care of those 'loose ends' and 'unimportant' features.
We tried to migrate two weeks ago, we redirected several pages to the new environment, but during the weekend there were so many bugs reported we chickened out and reverted.
There were 2 major problems:
1. We use cookies to authenticate between the two systems, and there is a difference in the way cookies are handled by Domino and by a servlet container. I was able to fix this, but since we have customers with IP-based access who reach the site using their own network as a proxy, it was not possible to test the fix thoroughly.
2. The data in the Domino system is gathered from several different sources and accumulated over more than 7 years.
We decided to reimport the data from the original sources into the new system (MySQL) to make sure the data is consistent and complete. As it turned out, some of the data is no longer available at the external sources, and therefore there were inconsistencies in the data between Domino and SQL.
We fixed the issues and are now running the two systems together in production;
In the left corner: MySQL, Tomcat and Apache servers, all running on new hardware with 192GB of memory for MySQL and a bit less for the Tomcat and Apache servers.
In the right corner Domino running on old hardware with 4Gb memory.
And you know what? The new system is faster than the old one. :P
We still have 'some' issues though: in the old environment we used our own JavaScript for table creation and sorting, while the new environment uses the DataTables jQuery plugin, and Internet Explorer 8 has huge problems processing more than 2k of data. Yes, I know what you are thinking, but we have customers (banks, of course) who 'cannot' upgrade, so we have to support that terrible piece of @#$^$%^.
 27 October 2013  comments (0)
I will post a schema of our final setup for the DTAP environment soon. It is pretty slick including Jenkins and Sonar for automated code checking, building and testing projects.
But since I have been temporarily assigned to projects to improve the current (Domino) production environment I don't have much to report on the actual techniques used in the project itself. I do get the progress reports on the projects though and they do not seem very positive. The Spring consultant has been replaced by another, but so far it looks like we still won't make the deadline.
Part of the delay is due to the fact that not enough time was reserved to maintain the current (Domino) environment, and no time at all was reserved to accommodate customers' requests for reports and the like. Another problem is that although this project was meant to recreate the current Domino production environment 'as is', there are still a lot of techniques we take for granted in the Domino world that have to be cleaned up and redesigned before they can be applied to an RDBMS like MySQL. And that means a lot of extra effort goes into redesigning code, effort that wasn't originally anticipated.
For example, a simple lookup database to store and edit keywords used by other databases in the infrastructure can be designed and put to use in Domino in just a few hours: create a form with a key and a field, fill it with the stuff you want (a single value, multi-value or even richtext), create a view to look up the values, and away you go. In the JEE world it seems not that simple; at least, the Spring experts approach every single lookup as a separate matter. I think there must be a smarter way even in JEE, but well, if you hire experts you had better listen to them, right? ;)
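To sketch what a 'smarter way' might look like: one shared, generic lookup service instead of a separate repository per keyword list. This is a plain-Java illustration with hypothetical names (a Map stands in for the real keyword table), not our actual Spring code:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// A minimal sketch of one shared keyword-lookup service, assuming all
// keywords live in a single store (e.g. one "keywords" table), mirroring
// Domino's generic @DbLookup pattern. Names are illustrative.
public class KeywordLookup {
    private final Map<String, List<String>> keywords = new HashMap<>();

    public void put(String key, String... values) {
        keywords.put(key, new ArrayList<>(Arrays.asList(values)));
    }

    // One generic entry point for every keyword list in the application,
    // instead of a hand-written repository per lookup.
    public List<String> lookup(String key) {
        return keywords.getOrDefault(key, Collections.<String>emptyList());
    }

    public static void main(String[] args) {
        KeywordLookup lookup = new KeywordLookup();
        lookup.put("countries", "NL", "DE", "BE");
        System.out.println(lookup.lookup("countries")); // [NL, DE, BE]
    }
}
```

In a real JEE setup the Map would be replaced by a single keyword entity plus one generic query, but the shape of the API stays the same.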
Anyway, I ask myself how far we would have gotten with this project if the decision had been made to port the data to MySQL but use Domino (XPages) as the frontend instead of (plain old) JEE.
 5 June 2013  comments (0)
What I thought would be a trivial task turned out to be a bit more work than I initially estimated.
There are several posts about this subject, including example code, but for various reasons they did not work for me. So I combined a couple of the techniques I found and came up with this:
We have an initiator:
public class ThreadJob {
    public static void main(String[] args) {
        List<String> processItems = new ArrayList<String>(Arrays.asList("item1", "item2", "item3"));
        long start = System.nanoTime();
        // init threads
        List<Future<ArrayList>> futures = new ArrayList<Future<ArrayList>>(processItems.size());
        final ExecutorService service = Executors.newFixedThreadPool(1);
        try {
            for (String item : processItems) {
                Future<ArrayList> list = service.submit(new ThreadCallable(item));
                futures.add(list);
            }
            for (Future<ArrayList> future : futures) {
                long fut = System.nanoTime();
                Object returnObject = future.get();
                System.out.println("future.get() took: " + (System.nanoTime() - fut) / 1000000 + "ms");
            }
            System.out.println("Run time: " + (System.nanoTime() - start) / 1000000 + "ms");
        } catch (Exception e) {
            // Handle error
        } finally {
            service.shutdown();
        }
    }
}
Which calls a 'ThreadCallable' class:
class ThreadCallable implements Callable<ArrayList> {
    private final String method;

    ThreadCallable(String method) {
        this.method = method;
    }

    public ArrayList call() throws Exception {
        ArrayList returnObject = new ArrayList();
        // Do your thing
        return returnObject;
    }
}
So why did I choose this instead of the ThreadSessionExecutor<IStatus>, which most people seem to use according to Google?
First of all, I had problems accessing other classes from the ThreadSessionExecutor.
Besides that, the initiation of the NotesSession was kind of a hassle. The OpenNTF 'Threads and Jobs' project is supposed to fix that, but it still gave me problems with calls to external classes and didn't work on ND9 for me.
So starting from the code above I wanted threads to access both SQL and Domino resources simultaneously. This means using external classes and the ability to start NotesSessions from within the ThreadCallable class.
This can be done using the SessionCloner (thanks to Tommy Valland).
class ThreadCallable implements Callable<ArrayList> {
    private final String method;
    private SessionCloner sessionCloner;
    private NSFComponentModule module;
    private DbPool dbPool; // our own MySQL connection pool helper
    Connection conn = null;
    Session session = null;

    ThreadCallable(String method) {
        this.method = method;
        // init MySQL connections
        this.dbPool = DbPool.get();
        try {
            conn = dbPool.getConnection();
        } catch (SQLException e) {
            e.printStackTrace();
        }
        // init Domino connections
        this.module = NotesContext.getCurrent().getModule();
        this.sessionCloner = SessionCloner.getSessionCloner();
    }

    public ArrayList call() throws Exception {
        ArrayList returnObject = null;
        try {
            // get NotesSession
            NotesContext context = new NotesContext(this.module);
            NotesContext.initThread(context);
            session = this.sessionCloner.getSession();
            // call other classes based on the method
            if ("sqlcall1".equals(method)) {
                returnObject = OtherClass.get(conn, method);
            } else {
                returnObject = AnotherClass.get(session, method);
            }
        } catch (Throwable exception) {
            // handle errors
        } finally {
            NotesContext.termThread();
            try {
                this.sessionCloner.recycle();
            } catch (NotesException exception) {}
        }
        return returnObject;
    }
}
This is still rough code, I will try to clean it up some more and run some long term endurance tests, but so far it looks like I'm on the right track.
 26 May 2013  comments (1)
As part of the project we need to send both individual e-mails from the website and newsletters using batch processing.
Using Thymeleaf e-mail templates, sending rich HTML e-mails is very simple. The only thing that disappointed me is the handling of inline images. But since the 'old' environment uses internet-based images, that is not an issue for now.
For newsletters and other batch processes we decided to use Spring Batch.... no surprise there :P.
Spring Batch is a highly configurable and also very complicated... uhm... comprehensive piece of software to manage batch processes. It does not schedule batch jobs (you can use Spring's task executor for that, or Quartz like we do), but it does provide terrific support for all kinds of functionality that you need when running batch jobs: things like chained batch processing, repeat settings, logging, transaction management, plus a couple of dozen other functions.
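To give a feel for the core idea Spring Batch manages, here is a plain-Java sketch of the read/process/write 'chunk' pattern; all names are illustrative and this is not Spring Batch API:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Function;

// Illustration of chunk-oriented processing: read items one by one,
// process each, and write them out in fixed-size batches. Spring Batch
// adds restartability, transactions and logging around this same loop.
public class ChunkDemo {
    static <I, O> List<O> runChunk(Iterator<I> reader, Function<I, O> processor,
                                   Consumer<List<O>> writer, int chunkSize) {
        List<O> all = new ArrayList<>();
        List<O> chunk = new ArrayList<>();
        while (reader.hasNext()) {
            chunk.add(processor.apply(reader.next()));
            if (chunk.size() == chunkSize || !reader.hasNext()) {
                writer.accept(chunk); // e.g. send one batch of newsletters
                all.addAll(chunk);
                chunk = new ArrayList<>();
            }
        }
        return all;
    }

    public static void main(String[] args) {
        List<String> out = runChunk(
                Arrays.asList("ann", "bob", "carol").iterator(),
                name -> name.toUpperCase(),
                batch -> System.out.println("sending " + batch),
                2);
        System.out.println(out); // [ANN, BOB, CAROL]
    }
}
```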
Oh, if you happen to use the latest Spring version (3.2.2.RELEASE), be sure to include spring-context-support in your project. The Spring people moved some Java mail dependencies there.
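For reference, a sketch of the relevant pom.xml entry (version matching the release mentioned above):

```xml
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-context-support</artifactId>
    <version>3.2.2.RELEASE</version>
</dependency>
```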
 13 April 2013  comments (0)
I am afraid there is not much to tell at this point, because I am currently half sidetracked.
First we had a serious problem with our infrastructure, and after that management decided they wanted new functionality in production that could not wait until the new infrastructure was finished.
What I did learn, however, is that datatables4j, however cool, does not offer enough functionality for me, so I changed my code to call the underlying jQuery DataTables directly. The main issue for me was the lack of JSONP (cross-domain) support, but there were more issues that made me 'revert'.
In the meantime the rest of the team is working on creating shared projects with dependencies using pom files, but since I am currently not fully involved I am working on independent projects. One of those involves creating audit trails. The simplest implementation I could find is Hibernate Envers, so I will use that for now.
To be continued..
 1 April 2013  comments (0)
Windows was not a problem, but when I tried to install ND9 on Debian all hell broke loose.
The install itself did work, but the server wouldn't start anymore. After changing some file settings it kept complaining about libnotes.so; yes, that old thing again :(.
After several attempts the error message changed from 'libnotes.so: cannot open shared object file' to 'ldconfig: /libnotes.so.sym is not an ELF file - it has the wrong magic bytes at the start'. You can imagine my frustration when, at that point, with my main production server down, my internet connection started failing.
After several attempts to revive my cable modem I had to switch to a dial-in card. Luckily that does not make much difference in speed when you only have command line access :).
Daniel Nashed put me on the right track for the next bit: append the line /opt/lotus/notes/latest/linux to /etc/ld.so.conf and run ldconfig to reload the shared library list. This gave an error, but somehow fixed the problem anyway.
After that it was time to install Traveler again, but that wasn't easy either: the command line option -Console did not work, so I did a -Silent install without checking the properties file first. Big mistake: the default Traveler install assumes a different install directory than the default one Domino uses :(.
After a successful install, HTTP still did not run, because of a HTTP JVM: java.lang.reflect.InvocationTargetException.
Checking the error log, I noticed that the NotesProgram= line in my notes.ini still pointed to the 8.5 directory.
Changing that still did not fix it, but disabling the XPagesPreload=1 setting in notes.ini did.
So finally back online and with a working server. Time for lunch :)
 22 March 2013  comments (4)
Due to a harddisk failure, a customer had to move his Domino webservers to other hardware.
This particular piece of hardware runs on SSD disks and there was only room for one server, so we kept our fingers crossed.
I knew moving to SSD is a performance boost especially for heavy Domino sites, but seeing it in your own production environment is something else!
The one server can easily handle the load that we used to divide among three Domino servers. In fact their site runs 50% faster now.
So if you have a heavy workload on your server and you can afford it, consider switching to SSD.
Don't forget to have a good secondary/backup system in place: SSDs are believed to be less reliable and pose an increased risk of data loss.
 19 March 2013  comments (0)
Full Text message: /data/notesdata/news.ft/ftgi/Itemsupp/gtr.SOS Open error. errcode = 3351 errno = 0
Oh well, nothing a recreation of the FT-index can't solve I hope...
 17 March 2013  comments (0)
And after only 3 weeks it is time for the first milestone on Monday: presenting several related views populated with data imported via scheduled webservices and entered via inline forms.
My experiences so far:
- SpringSource Tool Suite is a monster of an application development tool. Run it on a VM and it wants 2 cores and consumes all the memory it can find. If you think Domino Designer 8.5 is bad, you had better leave this one alone.
- Starting a project without first finishing the prerequisites is not good for your time-sheet. (DUH!)
- Switching the base model after a week isn't either. (double DUH!)
- Having to cram 3 weeks of development into one week is not the best way to start, especially whilst learning a new development tool.
- Using 'new' Java projects like Thymeleaf is cool, but it means that at the end of the day the external (Spring) expert will go home with your code and use it for his next project.
- Having no time reserved to find those cool new tools is not so cool; it is costing me a lot of sleep and probably the weekend :(
- Another cool tool: datatables4j, a great way to add sortable tables to your Thymeleaf templates.
- In case you haven't figured it out by now: I'm not a fan of projects that start without proper preparation whilst being run by managers who enforce tight schedules.
P.S. Development tip: delete all test classes (JUnit and such) once in a while and check that nothing depends on them, because such dependencies will screw up your production environment.
 8 March 2013  comments (0)
First an update regarding part 1 of the Spring project, the modelling bit: as it turns out, Spring provides the 'generic repository' natively, so with some configuration we can get rid of the bottom class.
Now for the view layer; I looked into the Thymeleaf library and it looks very promising.
Thymeleaf is a Java library. It is an XML / XHTML / HTML5 template engine (extensible to other formats) that can work both in web and non-web environments. It is better suited for serving XHTML/HTML5 at the view layer of web applications, but it can process any XML file even in offline environments.
It uses 'templates' to transform the data into the form you desire, a bit like xslt works, but you can choose almost any output format.
If you choose an HTML-based front-end, you can design the pages offline, which is ideal when, for example, an external webdesigner is responsible for the web interface. Another advantage is that Thymeleaf allows you to JUnit test the view layer as well. As with all Java libraries, choosing the right target Java version for the environment can be a bit tricky, but I have high hopes I can sort it out. A view-layer test looks something like this:
@Test
public void contact() throws Exception {
    mvc.perform(get("/contactlist"))
            .andExpect(status().isOk())
            .andExpect(content().string(containsString("John")))
            .andExpect(content().string(containsString("Jane")));
}
 28 February 2013  comments (2)
This is the first part of the Migration project from Domino to JEE
We started with installing and configuring VM's with our development environments using the Springsource Tool Suite (STS), GIT repositories, VPN etc.
After that we started on a data model in line with our code 'bible'.
We agreed on loosely coupled modules, but that makes it hard to keep classes persistent across all the modules.
For example in Domino we could show data from the database using a single line of code:
@DbLookup("";@DbName();"View";key;"Field");
In Spring we can do the same; there is just one more step involved, since the database is outside the framework. With the JPA implementation:
Query query = entityManager.createQuery("select c from ClassName c where Field = :Field").setParameter("Field", key);
List<ClassName> list = query.getResultList();
But that would mean that everything is hardcoded. Changing this using Model-view-controller (MVC), you get something along the lines of:
List<ClassName> list = classNameRepository.getClassNamesByKey(key);

class ClassNameRepository {
    public List<ClassName> getClassNamesByKey(String key) {
        Query query = entityManager.createQuery("select c from ClassName c where Field = :Field").setParameter("Field", key);
        return query.getResultList();
    }
}
Loose coupling means that these classes have to be decoupled even further:
List<ClassName> list = classNameRepository.getClassNames(key);

class ClassNameRepository {
    public List<ClassName> getClassNames(String key) {
        Query query = genericRepository.getQuery("select c from ClassName c where Field = :Field");
        return genericRepository.getQueryResultList(query.setParameter("Field", key));
    }
}
(Note that the code above is not pseudo code for demonstration purposes: all the layers in our project are accompanied by interfaces and implementations to decouple the code from the techniques used.)
class GenericRepository<T> {
    public Query getQuery(String query) {
        return entityManager.createQuery(query);
    }

    public List<T> getQueryResultList(Query query) {
        return query.getResultList();
    }
}
As you can see, that requires quite a bit of coding for just a simple query. The advantage is that we can now use the generic repository for all our queries and classes, and we could easily exchange the JPA-based generic repository for a Hibernate-based one, while Spring takes care of the coupling.
 22 February 2013  comments (4)
After months of preparation, I will join a small team on Monday, to develop a new infrastructure for a customer currently running an extensive Domino platform.
The goal is to move the data to MySQL and the design elements to JEE using SpringSource, Hibernate and JPA. The frontend has yet to be decided upon, but for now we will go with JSP.
The current infrastructure consists of 154 databases with terabytes of data and many connections to 3rd party websites and webservices. To keep the project under control, we will migrate the applications in phases, but since we are talking about a SaaS application, we have to make sure that the migration is seamless; normal operation may not be disturbed.
Our 'bibles' for this project:
Clean Code: A Handbook of Agile Software Craftsmanship
Pro Spring 3
It looks like this will be one of the most challenging projects of the last 10 years, so I hope I can find time to write down my experiences now and again.
Other posts regarding this topic:
part 1: phase 1 first release
part 2: view layer
part 3: first deliverables
part 4: random observations
part 5: batch and email
part 6: progress report on the migration from domino to jee
part 7: phase 1 first release
part 8: Frontend stuff
 14 February 2013  comments (2)
8:00
8:46
 15 January 2013  comments (0)
... but here we go again :P
Upgraded my mac and a Windows 8 VM with Notes 9, and my home (test) server with Domino 9 and traveler.
Let's see if this version deserves a major release number, instead of 8.5.4.
Mac client:
Admin client
DDE:
 13 December 2012  comments (0)
It took only 3 attempts, and not even because of Parallels, but because of restrictions in the Windows upgrade license.
Anyway, it looks good, but it takes a lot more space than Win7. Maybe I should consider a bigger flashdrive in my next Macbook....
 4 November 2012  comments (0)
I started on two new projects this week, both in-house.
The first application is a joint venture with Martin Schaefer (Domino designer) and Willem Jan de Jong (graphical designer). It is my first project with Willem Jan, and I have only seen some icons he did for this project, but I'm already impressed; I never knew you could do so much with so little.
The other one is for my wife, who recently started her own business in 'absence management'. I am developing an online system to schedule, track and manage all the actions needed.
Anyway, it's been a looong time since I had the opportunity to develop something from scratch and without any restrictions regarding platform or coding language. I find myself working 12 hours per day just because it is so much fun :)
 3 November 2012  comments (0)
 15 September 2012  comments (0)
If you are used to suspending your VM without closing down the DDE (Lotus Designer), be aware that restoring the VM and (re)accessing a database with changed Java code can invalidate Java code on the server, even without saving anything.
Experienced that when I came back from vacation, people were 'not amused' :(
 11 September 2012  comments (1)
I almost bought a new Macbook, but I managed to resist the urge, so only software upgrades now :(.
All ready for a new year of running 70+ hours per week :).
 10 September 2012  comments (0)
I finally rented a server 'in the cloud' because ... well... it's the thing to do nowadays ;)
Let me know if you encounter problems.
 18 April 2012  comments (3)