Domino on Linux/Unix, Troubleshooting, Best Practices, Tips and more ...

Notes 10 - Stay away from older XTAF Dictionary Pack for spell check

Daniel Nashed  16 October 2018 23:05:12

If you try to install the separately available dictionaries, which only exist in a 9.0 version ("IBM Notes XTAF Dictionaries V9.0 for Windows Multilingual (CIF0EML)"), your Notes 10 client will hang at the splash screen!
This appears not to be a new issue, but it did not impact most of us because we mostly installed the international versions including the MUI.

We either installed the local language version or a MUI pack which always contained the right dictionaries.


Until the G1 language versions ship we would need the so-called XTAF dictionaries, which are available as a separate download.

But there are no dictionaries that work with Notes 10 yet. In fact there are no XTAF dictionaries that work with the current 9.0.1 FPs either - a known issue.


I asked in the beta forum and I opened a support ticket. But the best answer finally came from Christian Henseler.
He explained all the details and background and possible work-arounds until we get matching XTAF dictionaries for Notes 10.


In short: We should wait until we get the right version which will work with Notes 10!


The longer answer: beside the information above, Christian also explained to me that there is a way to extract the XTAF dictionaries that come with the 9.0.1 MUI pack and add them to the install package of Notes 10.


Or, with some extra work, build an add-on install package like the dictionary package from IBM.
From Christian's research the XTAF dictionaries in 9.0.1 and 10.0 have the same version, so it sounds like a safe way.


But from his analysis there seems to be a difference in the .dic files still used by the Basic client. They have double the size, for example canadian.dic.
So I really think it makes sense to wait until the MUI packs and the separate dictionaries ship before the end of the year.


This post is intended to provide background and also give you a heads up to stay away from the old 9.0 dictionaries!


Thanks again Christian for your as always detailed analysis!


And I hope we can save you some trouble if you are looking into this.
I know at least 3 friends who tried to install it today...


Daniel

nshDelLog - Domino Deletion Log Annotation and Backup V 0.9.0

Daniel Nashed  16 October 2018 19:34:57
Finally I have a first version of my deletion log application.
It took a while because I wanted to add some flexibility and make it configurable.

I am not officially releasing it yet to everyone because I want some feedback and some beta testing.

If you want a pre-release version let me know.

Here is the short documentation that I wrote.
This version only works with the Domino V10 GA release and no longer with the beta version, because the length of the custom value has been added as extra columns.
So make sure that you remove your deletion-logs from beta testing :-)

-- Daniel


nshDelLog - Domino Deletion Log Annotation and Backup V 0.9.0
Copyright 2018, Nash!Com - Daniel Nashed Communication Systems


Short Documentation Nash!Com Deletion Log Application

Quick Start
  • Copy template to server ensure proper template ACL (default uses LocalDomainAdmins)
  • Sign Template --> there is a Sign DB Action for current user.id or server.id via AdminP (needs unrestricted agent permissions)
  • Create database from template --> suggested default name: nshdellog.nsf but you can choose any name
  • Enable Agent via Config Profile (Status enables the agent)
  • Review Standard Settings


Introduction

Deletion Logging is a new feature introduced in Domino 10 to track deletion of documents and design elements.
All different types of deletions are stored in a central log file "delete.log" in the IBM_TECHNICAL_SUPPORT directory of your server.

This logging is implemented at a low database level and is activated per database. The settings are stored inside the database and do replicate.


Enable Deletion Logging per Database

Right now the only official way to enable deletion logging is to use the compact servertask with the -dl option (see examples below).
You can add up to 4 additional fields per database that you want to log. Those fields could differ depending on the type of database and the logging purpose.
The log distinguishes between HARD deletes and SOFT deletes and also allows you to trace the deletion of design elements.

The compact operation looks like this (example):
load compact mail/nsh.nsf -dl on "$TITLE, Form, Subject, DeliveredDate, PostedDate, FullName"

Tip: You can specify more than 4 fields but only the first 4 items found are logged.


Log file Location and Content

After you enable deletion logging, each kind of delete is recorded in a central text file on the server. IBM has chosen a text file for performance reasons.
Here is what the standard log looks like and how it looks with the custom log fields.

In my example I am interested in the Subject and DeliveredDate, and also the Form of the document.
In addition for design elements I am interested in the design element name stored in $TITLE.

Those would be the type of fields I would add for example for a mail-database. The choice of $TITLE for design document delete can be quite helpful. The note class alone might not be sufficient to identify the design element deleted.

The resulting log files are stored in IBM_TECHNICAL_SUPPORT directory and look like this:

"20181016T140343,47+02","del.nsf","C125831B:004BC903","nserver","CN=Daniel Nashed/O=NashCom/CÞ","HARD","0001","08B31722:E41F1971C1258323:0068EF49","Form","4","Memo","Subject","19","Test UNDO  Updated2"

The name of the file has the following syntax:

delete__yyyy_mm_dd@hh_mm_ss.log

Like other files the currently active log file has a default name delete.log and the first line contains the name of the file for renaming it when the server is restarted (similar to console.log as shown above).
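Assuming the literal naming pattern above (real file names may embed additional parts, such as a server name), the rename timestamp can be recovered from the file name, for example:

```python
from datetime import datetime

def parse_delete_log_name(filename):
    # Sketch: assumes the literal delete__yyyy_mm_dd@hh_mm_ss.log pattern
    # shown above; real file names may contain additional name parts.
    stem = filename[len("delete__"):-len(".log")]
    return datetime.strptime(stem, "%Y_%m_%d@%H_%M_%S")

print(parse_delete_log_name("delete__2018_10_06@12_58_30.log"))
```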

Here is a short description of the columns. The last columns depend on your configuration and the list is comma separated and has quotes around the values. Quotes are escaped accordingly.

  • Time and date in the timezone of the server, with the server's timezone offset at the end.
  • Database Name
  • Replica ID
  • Process which deleted the note
  • User who deleted the note
  • HARD/SOFT Delete, RESTORE for a SoftDelete
  • NoteClass of the document (for example 1 for a document, 8 for a view/folder)
  • UNID of the document
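As an illustration of how these columns line up, here is a minimal Python sketch that splits one log line into the standard columns and the custom field triplets (the column labels are my own, not official names, and the sample user name is shortened):

```python
import csv
import io

# My own labels for the standard columns described above (not official names)
STANDARD_COLUMNS = ["timestamp", "database", "replica_id", "process",
                    "user", "delete_type", "note_class", "unid"]

def parse_delete_log_line(line):
    # The line is comma separated with quoted, escaped values, so the
    # csv module can split it; quoted commas (e.g. in the timestamp) survive.
    fields = next(csv.reader(io.StringIO(line)))
    entry = dict(zip(STANDARD_COLUMNS, fields))
    rest = fields[len(STANDARD_COLUMNS):]
    # Custom fields follow as triplets: item name, value length, value
    entry["custom"] = {rest[i]: rest[i + 2] for i in range(0, len(rest), 3)}
    return entry

sample = ('"20181016T140343,47+02","del.nsf","C125831B:004BC903","nserver",'
          '"CN=Daniel Nashed/O=NashCom","HARD","0001",'
          '"08B31722:E41F1971C1258323:0068EF49",'
          '"Form","4","Memo","Subject","19","Test UNDO  Updated2"')
print(parse_delete_log_line(sample)["custom"])
```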

Custom Log Fields

After those standard fields you see the up to 4 custom fields that you have optionally specified with compact.
The first column always gives you the name of the field, the second column the length of the value, and the following column gives you the value itself.
The text elements and the total log line are limited. The current limits are 400 bytes per item and 4 KB for the whole log line.
This should be sufficient in most cases because we only need it to find the document.

The field types that can be used are Text, Text_List, RFC822_Text, or Time.
And those fields have to be present in the database when logging is enabled!

The current log file encoding is LMBCS (Lotus Multi-Byte Character Set), which is the internal representation that Notes/Domino has used since day one to store text.
But it would be difficult to read this charset encoding outside Notes. There are plans for the next update to support other formats. But for now it is LMBCS.


Delete Log Application

Functionality

This deletion log application has the following main functionality:
  • Deletion Log Annotation
    Periodically read the deletion logs on a server and annotate them into a Notes database on the server
    This import collects information from the current delete.log file and also recent delete log files renamed after a server restart
  • Manual Import of Delete Log Files
    You can also manually import log files into a local or server based log database for annotation by simply selecting the log files.
  • Central Backup of finalized Delete Log Files
    Collect completed delete log files and store them in a centrally located Notes database as attachments for archiving those log files.
    Once a deletion log file is saved to the database, the file is cleaned up on disk.


Installation Instructions
  • Copy the template to your server
  • The Config Profile also contains an action menu to sign the database with your current user.id or with the right server.id via AdminP.
    In addition there is another button to check the status of the AdminP request. It will open the first response document of the AdminP request once the request has been executed.
  • Create a new database from template.
  • Ensure the agent can be executed by properly signing the application with an ID that has unrestricted agent permissions.
  • By default the periodical agent is scheduled every 5 minutes on all servers ( Run on * ) and does not need to be configured.
  • The agent is disabled by default. It will be enabled when you set the status in the Config Profile to "on".

Deployment Recommendations

You should install a separate copy with different replicas on different servers.
In addition you could have a central log database for archived delete log files.
The default location for those backup log files is the current database.
But you can change the location in the configuration profile depending on your infrastructure requirements.


Housekeeping for the Delete Logging Database

Deletion logging can generate a lot of log documents. You should think about how you remove those documents after a while.
This can be implemented by setting a cut-off delete interval for the database (e.g. 30 days).
You could still manually import backup log files later on in case you need to analyze older log data.
The config profile contains settings to specify the cut-off interval and also the cut-off delete flag.


Implementation

The application is written in Lotus Script and consists of the following main components:
  • One script lib with the main logic
  • One agent which runs periodically on a server to annotate and collect the log files
  • One agent which allows manual log file annotation
  • Configuration Profile
  • Forms and Views to show the log data

There is intentionally no navigator for the views, so you can easily add new views as needed for your evaluations without dealing with a navigator.
The agent runs periodically on the server to annotate the current delete.log file and also to annotate and backup older log files.

For the current log file "delete.log" the annotation is incremental. The delete.log file is read and annotated, and the last position is stored in notes.ini.
The notes.ini setting has the following format: "$NSHDELLOG_DELETE_" + log file name, and stores the last position.

Example: $NSHDELLOG_DELETE_DOM-ONE_2018_10_06@12_58_30.LOG=18977
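The incremental read can be sketched like this (a simplified Python illustration of the idea; the real agent keeps the offset in notes.ini, and the file name and entries here are made up):

```python
import os
import tempfile

def read_new_entries(path, last_pos):
    # Read only what was appended since the last stored position and
    # return the new lines plus the new offset to persist.
    with open(path, "r", encoding="utf-8") as f:
        f.seek(last_pos)
        new_lines = [line.rstrip("\n") for line in f.readlines()]
        return new_lines, f.tell()

# Tiny demonstration with a temporary file standing in for delete.log
path = os.path.join(tempfile.mkdtemp(), "delete.log")
with open(path, "w", encoding="utf-8") as f:
    f.write("entry1\nentry2\n")

lines, pos = read_new_entries(path, 0)       # first run reads everything
with open(path, "a", encoding="utf-8") as f:
    f.write("entry3\n")                      # another delete gets logged
lines2, pos2 = read_new_entries(path, pos)   # next run reads only the tail
print(lines, lines2)
```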

The name is taken from the first log line, which already contains the right name in the delete.log file, to allow one final processing pass after the log file has been renamed on restart.
After reading a renamed log file, the log is centrally archived, deleted from disk and the notes.ini entry is removed.

Entries that generate an error during parsing are stored in separate documents listed in a separate view.
The application also has separate views with error log information for run-time errors,
and also run-time information like bytes and lines read (mainly to see if the application also works with larger amounts of data).


Configuration

You find the configuration in the Config Profile with the following configuration Options.

  • Status :  On|Off
    Enables the annotation and collection of logs on the server.
    When you save the profile it will automatically enable/disable the server scheduled agent.
  • Log Backup: On|Off

    Enables Collection of finished delete log files (all files looking like delete_*.log)
  • Log Level: Disable|Enable|Verbose|Debug

    Enables logging
  • Import Charset: (default LMBCS)

    The currently used charset is LMBCS. It might change in the future. This setting changes the charset for the import file.
  • Log Backup Server:

    Server location for log backup files
  • Log Backup Database:

    Database location for log backup files
  • Remove Log after Backup: On|Off

    Determines if finished delete log files are removed from the server once they have been stored in the backup database
  • CutOff Interval: (default: 30)

    Cutoff interval set for the standard log database (the current database, not the dynamically specified formula databases in the Advanced Configuration)

    CutOff Delete: On|Off (default: On)

    Enable Cutoff-Delete for the standard log database

    Those settings are automatically updated when the Config Profile is saved

Advanced Configuration

If you have special logging requirements you can use the advanced configuration and specify a formula for the log database.
The result of the formula will be used as database name for dynamic logging.

If the database does not exist, it will be dynamically generated from template. You can specify the location of the template.
You could use different databases for different processes or create a new log database every month (see example below).
Each database can have a different cutoff-interval. The replica cutoff-settings are checked and set every time the agent starts.

Using multiple databases should not lead to a performance impact because the database handles are cached inside the application.

The result of the formula, computed on the log document before it is saved, is used as follows:

""  (empty string)
Log to current database
This is also the fallback if the log database cannot be created

"DISCARD"

Log entry will not be written.

Local Database Name string

This string is used as the database name. You can specify the name, title, cutoff interval and also whether documents should be cut off after a certain time.
The format uses a "|" as the delimiter. You can use any type of formula, which could check any type of field.

DatabaseName|Title|Cut-OffInterval|CutOffDelete

Example:
datestr := @text(@year(@today))+"_"+@text(@month(@today));  "nshdellog_"+ datestr+ ".nsf" + "|Delete Log " + @text(@year(@today)) +" / " + @text(@month(@today)) + "|30|1";

Result:
nshdellog_2018_10.nsf|Delete Log 2018 / 10|30|1
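The three possible formula results can be sketched as follows (a Python illustration of the rules above; the function name and dictionary keys are my own):

```python
def parse_formula_result(result):
    # "" -> log to the current database; "DISCARD" -> drop the entry;
    # otherwise pipe-delimited: DatabaseName|Title|Cut-OffInterval|CutOffDelete
    if result == "":
        return {"action": "current"}
    if result == "DISCARD":
        return {"action": "discard"}
    name, title, interval, cutoff_delete = result.split("|")
    return {"action": "log", "database": name, "title": title,
            "cutoff_interval": int(interval),
            "cutoff_delete": cutoff_delete == "1"}

print(parse_formula_result("nshdellog_2018_10.nsf|Delete Log 2018 / 10|30|1"))
```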



Change History


V 0.9.0 / 16.10.2018

Initial Beta Version



Notes&Domino 10 removed the NSD Service

Daniel Nashed  11 October 2018 18:31:19
For technical reasons, which should be explained shortly in a planned technote, the NSD service has been removed from the Notes 10 client and Domino 10 server.
So this is a planned change and not an install error. Some of the warning messages on the server side still mention the NSD service, but it has been removed on purpose.


Running NSD as a service has been introduced to allow NSD to collect information which only the system account or an administrator could normally collect.

Also, depending on how the processes are started, administration rights are needed to attach to the Domino server processes and to later kill the processes if needed.


On the server side this isn't really a relevant change for you because you usually run your Domino service (nservice.exe) with the system account.

In this case fault recovery starts nsd.exe with system account rights which provides full access to processes.


Running Manual NSD on Server


If you need to collect a manual NSD you have to open the cmd prompt with Administration rights. Or you need to start NSD from the running server (which is not always possible in hang situations).


So on the server side when running the server with system account there isn't a big change.


NSD Client


On the client side having the NSD service was more important -- especially on newer Windows versions and on Citrix installations.

I have done some tests on my local client, and beside the error messages about attaching to other processes, NSD was still able to get call-stack data from the Notes processes and also memcheck data.

We have to see how it behaves on environments like Citrix.



Here are the WARNING messages on client and server.
On the client the statements are correct. On the server they are a bit misleading.


I hope this explains what is going on. There will be a technote describing some more background hopefully soon.

Update 12.10.2018:


A new technote has been released with an official status. IBM is recommending to uninstall the NSD service in 9.0.1 as well.

https://www.ibm.com/support/docview.wss?uid=ibm10734889


The technote also includes a link to another TN with information about deinstallation.


-- Daniel



-- Client NSD --


ERROR (0): AdjustTokenPrivileges failed - (1300) Not all privileges or groups referenced are assigned to the caller.

WARNING (0): NSD is unable to obtain privileges for some debugging operations.  

If you are running as a limited or restricted user then NSD will be unable to obtain some of the information it is attempting to collect.  

NSD will also produce error messages when privileged operations fail.  

However, useful data such as Notes/Domino callstacks will still be collected.



-- Server NSD --


WARNING (0): The NSD service is required on this operating system and must be installed and started

          to enable NSD processing. Because of this requirement, the current NSD log

          may contain errors, warnings and missing data. See nsd -help for more

          information about installing NSD service using -svcinst and -svcstart options.

    Domino V10 - HTTP Requests and REST Services from Lotus Script

    Daniel Nashed  11 October 2018 00:55:16

    One of the long-missed features in Lotus Script is working with HTTP requests.
    Since Notes/Domino V10 you can now use HTTP requests directly from Lotus Script -- For example to query data from a website or from a REST service.

    In the beta I played around with it already. I was especially interested in authentication and HTTPS.

    HTTPS now works and authentication can be implemented on your own. The somewhat tricky part is the Base64 routine that you need for the authorization header. But there is a way to leverage the MIME classes for that.

    So the following example helps you get started. It builds the authentication header and also uses HTTPS to request a website.
    Tip: Depending on the request you might run into an issue with more redirects than the function follows by default. So you have to increase the limit as shown in the example.

    For HTTPS the certificate needs to be verified. I have tested with my Let's Encrypt certificates and it worked well.

    Another tip: If you run into issues with certificates or other parts of the NotesHTTPRequest, there is a debug notes.ini setting Debug_NotesHTTPRequest=1.

    I have tested the following example with the Notes V10 GA client.
    The Designer help has some additional information also for the other functions of that class.

    Enjoy

    -- Daniel


    Option Declare

    Sub Initialize
            Dim Session As New NotesSession        
            Dim ret As String
            Dim URL As String
            Dim headers As Variant
            Dim user As String
            Dim password As String
            Dim webRequest As NotesHTTPRequest
            Set webRequest = session.createhttprequest()

            user = "john@acme.com"
            password = "mypassword"
            webRequest.Maxredirects = 5
            URL = "https://www.acme.com"
           
            Call webRequest.Setheaderfield("Authorization", "Basic " + EncodeBase64 (user + ":" + password))
           
            ret  = webrequest.Get(URL)
           
            headers = webRequest.GetResponseHeaders()
           
            ForAll h In headers
                    MessageBox h
            End ForAll
           
            MessageBox ret
    End Sub

    Function EncodeBase64 (StrIn As String) As String
            Dim session As New NotesSession
            Dim stream As NotesStream
            Dim db As NotesDatabase
            Dim doc As NotesDocument
            Dim body As NotesMIMEEntity
     

            Set stream = session.CreateStream
            Call stream.WriteText (StrIn)
           
            Set db = session.CurrentDatabase
            Set doc = db.CreateDocument
            Set body  = doc.CreateMIMEEntity
           
            Call body.SetContentFromText (stream, "", ENC_NONE)
            Call body.EncodeContent (ENC_BASE64)
           
            EncodeBase64 = body.ContentAsText
           
            Call stream.Close
            Set doc = Nothing
    End Function
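    For comparison, the same Authorization header value can be computed in a few lines of Python, which is handy to cross-check what the Lotus Script MIME round-trip produces (a sketch using the example credentials from above):

```python
import base64

def basic_auth_header(user, password):
    # Basic authentication: base64 of "user:password", prefixed with "Basic "
    token = base64.b64encode((user + ":" + password).encode("utf-8")).decode("ascii")
    return "Basic " + token

print(basic_auth_header("john@acme.com", "mypassword"))
```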

    IBM Domino V10 Product Family Delivery Milestones

    Daniel Nashed  10 October 2018 22:20:47
    I got questions from multiple customers today and there is a blog post from IBM Germany answering all of them and more. Let me translate. Here is a quick summary in English..

    For German check the original post-->  https://dnug.de/ibm-domino-10-die-meilensteine/

    - Shipped as of today 10.10.2018:

    Domino V10.0, Notes V10.0, Administrator Client, Domino Designer & IBM Traveler for Windows, Linux and AIX in English.
    For Linux we get CentOS 7.4 and higher support and Domino will support RHEL 7.4 / SLES 12 and higher.

    - Group 1 Languages (German, French, Japanese and Chinese) will follow in the 10.1 release planned for end of 2018.
    - Language versions for Group 2 and 3 will follow in 2019.

    - iSeries / System i will also follow in 2019 because it's a completely different platform and there is additional work to do (I recall from earlier days that they had help from a dedicated iSeries team).
     The plan on IBM iSeries is to ship Domino, Traveler, Sametime, Verse on-Premises (VOP) and afterwards the Domino App Dev Pack

    - The current Sametime version is 9.0.1 FP1.
    - There is a roadmap for the new Sametime 10 in 2019. This includes persistent chat and multi-client chat support (you can be logged into multiple clients)
    - Notes V10 customers (customers on maintenance) have mobile Sametime support included in their Notes V10 client license!

    - Connections Profile and Files plug-ins are updated to support V10 and you can find them on DeveloperWorks.

    - IBM Verse on-premises (VOP) has its own agile roadmap. The 1.0.0.5 release is planned for end of October and IBM/HCL plans to ship an update every quarter.

    - The Verse mobile app has ongoing improvements and will continue to ship via the app stores

    - There is a beta for the Domino AppDev Packs which can be tested with Domino V10 GA --> See details here https://ibm.biz/V10AppDevPack

    - The new mobile Notes application supporting the iPad via a native iOS app will be available in beta starting end of October!
    - I have seen the app already and it works like a charm! It will be included in the Notes V10 license at no extra cost when you are on maintenance!

    - The new Mac V10 client is planned for end of this year. It will have the same functionality as the Windows Notes client and will support MacOS 10.14 (Mojave).
    - Both versions include Sametime 9.0.1 FP1
    - Stay tuned for a beta for the Notes 10 Mac client!

    Domino Applications on Cloud (DAC) Version 10 support is planned within the next 30 days. For changes in the context of this migration you should contact your DAC service team.


    For beta announcements and other details you should check the Destination Domino website --> https://www.ibm.com/collaboration/ibm-domino

    I hope that gives you a quick summary. I had the same questions..

    Beside this I am currently planning my V10 deployment. My mail file and Notes production client are already updated, and also my beta environment.
    My production Domino and Traveler Linux environment has to wait for the CentOS 7.5 update.
    But I am already deploying the new system templates and switching my admin server to a Windows machine in the same domain.

    -- Daniel


     

    Domino V10 is available for download on Passport Advantage

    Daniel Nashed  10 October 2018 11:30:32
    Just a quick note and the most important part numbers.

    Here is the entry point --> https://www.ibm.com/software/howtobuy/passportadvantage/paocustomer


    And here are the part numbers, descriptions and SHA1 hashes...

    Downloading & Installing right now ..


    The Mac client is not yet available. From what I understood it will be available to the beta community first and will be shipped soon.


    This is the first release. They are planning a V10.0.1 by the end of the year and a V10.1 version for next year.

    And in 10-12 months we will get Domino V11. Now that Domino V10 is available we can start to look into the released version and you will see more blog posts.


    By the way .. there will be a DNUG Domino Day on 15.11.2018 in Düsseldorf where we will cover all the new stuff with best practices and what's new as well.

    The Agenda is already finalized. But we have to add the speakers for the Domino V11 session and the first German JAM Event for Domino V11.

    Barry Rosen (HCL) & Thomas Hampel (IBM) are already confirmed for those two slots.


    -->
    https://dnug.de/event/domino-day-2018/

    -- Daniel
    Part number Description SHA1 Hash
    CNW1REN IBM Notes 10.0 Basic Configuration for Windows English AF837AB75687094FB4F5AB123A993776F8A7C588
    CNW1SEN IBM Notes 10.0 for Windows English 7D27743F55595609EBFC6EF702E33EED25494BA1
    CNW1WEN IBM Notes, Domino Designer and Admin Client 10.0 for Windows English 8062E6AE78E51A84941777309E320EF14B62F469




    Part number Description SHA1 Hash
    CNW1XEN IBM Domino 10.0 64 bit for Windows English 924619957E6D25CBDD767C4E85D2C934E95C3466
    CNW1YEN IBM Domino 10.0 64 bit for AIX English 4FBB33E7A6D0D4333C3AE184918418CC2ACD667C
    CNW1ZEN IBM Domino 10.0 64 bit for Linux English 39CBF3E4195CF0F777CC4919CAFB7C51A9AAD6A7







    Part number Description SHA1 Hash
    CNW23EN IBM Traveler v10.0 for Windows English AC6B1ED3F6E9785516C1892F0AA96A5366548A07
    CNW24EN IBM Traveler v10.0 for Linux English B5F1DD362BFE4F987E6F0B6D30F6166FD46C1383
    CNW25EN IBM Traveler v10.0 for AIX English BDAFF332EFFF8499D596F8AEA5D2139525F76F08


    Domino V10 Deletion Logging explained

    Daniel Nashed  7 October 2018 11:46:33
    During the beta I have played around with deletion logging, gave feedback and had some questions.
    Now shortly before the Domino V10 Launch event I think it is time to share some more detailed information.
    As far as I understood there are no changes planned between Beta 2 and GA.


    Deletion Logging is implemented at a low database level and is activated per database. The settings are stored inside the database and do replicate.


    Enable Deletion Logging per Database


    Right now the only official way to enable deletion logging is to use the compact servertask with the -dl option (see examples below).


    You can add up to 4 additional fields per database that you want to log. Those fields could be different depending on the type of database and the logging purpose.

    The log distinguishes between HARD deletes and SOFT deletes and also allows you to trace the deletion of design elements.


    The compact operation looks like this (example):


    load compact mail/nsh.nsf -dl on "$TITLE, Form, Subject, DeliveredDate"



    After you enabled the deletion logging, each kind of delete is recorded in a central text file on the server.

    IBM has chosen a text file for performance reasons. I would have wished they wrote this information into the deletion stub itself.


    Here is what the standard log looks like and how it looks with the custom log fields.


    In my example I am interested in the Subject, DeliveredDate. And also the Form of the document.

    In addition for design elements I am interested in the design element name stored in $TITLE.


    Those would be the type of fields I would add for example for a mail-database. The choice of $TITLE for design document delete can be quite helpful. The note class alone might not be sufficient to identify the design element deleted.


    The resulting log files are stored in IBM_TECHNICAL_SUPPORT directory and look like this.


    "20181006T130419,29+02","del.nsf","C125831B:004BC903","nserver","CN=Daniel Nashed/O=NashCom/CÞ","HARD","0008","45DF55C1:15FE5439C125831E:003CB1F4","$TITLE","TestFolder"


    "20181006T130241,00+02","del.nsf","C125831B:004BC903","nserver","CN=Daniel Nashed/O=NashCom/CÞ","HARD","0001","92A7E32E:6A7A0FD9C125831B:0069F662","Form","Memo","Subject","Test Mail äöü  / 电脑死机"


    The name of the file has the following syntax:


    delete__yyyy_mm_dd@hh_mm_ss.log


    Like other files the currently active log file has a default name delete.log and the first line contains the name of the file for renaming it when the server is restarted (similar to console.log as shown above).


    Here is a short description of the columns. The last columns depend on your configuration and the list is comma separated and has quotes around the values. Quotes are escaped accordingly.


    - Timedate in the timezone of the server plus the timezone of the server at the end.

    - Database Name

    - Replica ID

    - Process which deleted the note

    - User who deleted the note

    - HARD/SOFT Delete, RESTORE for a SoftDelete

    - NoteClass of the document (for example 1 for a document, 8 for a view/folder)

    - UNID of the document


    After those standard fields you see the up to 4 custom fields that you have optionally specified with compact.

    The first column always gives you the name of the field. The following column gives you the value.

    The text elements and the total log line is limited. The current limits are  400 bytes per item and 4K for the whole log line.
    This should be sufficient in most cases because we only need it to find the document.


    The field types that can be used are Text, Text_List, RFC822_Text, or Time.

    And those fields have to be present at the database when it is enabled!


    The current log file encoding is LMBCS (Lotus Multi-Byte Character Set), which is the internal representation that Notes/Domino has used since day one to store text. But it would be difficult to read this charset encoding outside Notes.

    There are plans for the next update to support other formats. But for now it is LMBCS.


    Delete Log Application


    All those files are located on the server and are not really admin-friendly to read. So I am currently building a small application which imports the log files into a server database.

    The same application can also collect the final log files from each server and store them in a global database to archive those logs before deleting the physical file on disk.


    So on the one side you will have an analysis database with all the logs in a sortable and searchable way. And in addition you have full access to all the delete log files in a central place.

    The application has a scheduled agent which checks the current log file and also existing log files for importing and archives the older log files.


    I have a first version implemented and I am still working on it. But I am planning to publish this database soon.


    If someone wants to beta test the application, drop me a note. And I will also have it with me at the Domino V10 launch event next week in Frankfurt and also at the DNUG bleed yellow event the evening before.


    Daniel



    Countdown Domino V10 - Be prepared and join us for the Bleed Yellow party

    Daniel Nashed  30 September 2018 09:58:54

    Being part of the beta program I have been working with Notes/Domino 10 for a while.

    As one of the lucky holders of the golden tickets invited to the HCL factory tour in July, I had quite early access and also actively worked with the later public beta, giving feedback.

    But also for me the Notes/Domino V10 Launch in October will probably bring some surprises!!


    I am sure you heard of the launch event in Frankfurt 09.10.2018 (see
    https://www.ibm.com/collaboration/ibm-domino).
    The first part of the event will be live streamed from Frankfurt. But there is more planned during the afternoon.

    The evening before there is a DNUG Bleed Yellow launch party in Frankfurt where you are all also invited!
    https://dnug.de/bleed-yellow-party/
    As an IBM Champion and active DNUG member I will be at both events. Haha, I am already thinking about the yellow dress code for the evening event ;-)


    And I will start upgrading my first production servers once it is officially released, because I am always on the yellow bleeding edge.


    Be prepared on the Linux side


    But I had also a discussion with a customer last week. They want to immediately install Domino 10 for some of their servers because of the changed database limits (256 GB maximum database size, ID table size improvements) for their archive databases where they need to be able to extract data and provide single NSFs without DAOS.


    They are running on Linux and still use RHEL 6.9. As blogged before Domino V10 requires at least RHEL 7.4 and SLES 12 because a newer compiler is used.

    So if you are a Linux customer running an older Linux version, you can already look into upgrading or installing new machines for your first pilot servers.


    My hosted servers are still on CentOS 6.10. So I will probably move to a new machine on my secondary server first.

    In general this is the way customers choose to update their servers, especially in virtualized environments.
    Domino has always been easy to move from one machine to another by reinstalling Domino and just copying over the data (in virtual environments, reassigning the virtual disks containing the data).


    My start scripts already support Domino 10 and also the current Linux releases with systemd.
    Moving over to a new Linux version that uses systemd might be the only more complicated part. Systemd is really different. But for Domino my start script will help you.
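    For illustration, a systemd unit for Domino looks roughly like the following. This is a minimal sketch only -- the user name, installation paths and start script location are assumptions you need to adjust, and my start script ships with a complete, tested unit file:

```
# /etc/systemd/system/domino.service -- minimal sketch, adjust paths and user
[Unit]
Description=IBM Domino Server
After=network.target syslog.target

[Service]
Type=forking
User=notes
LimitNOFILE=65535
ExecStart=/opt/ibm/domino/rc_domino_script start
ExecStop=/opt/ibm/domino/rc_domino_script stop
TimeoutSec=300

[Install]
WantedBy=multi-user.target
```

    After installing the unit, you enable it with `systemctl enable domino.service` and control the server via `systemctl start domino.service` / `systemctl stop domino.service`.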

    Migrating to Domino 10


    Everything else will work in a very similar way to what we saw in previous updates.

    This includes the new ODS 53, which needs the new create_r10_databases=1 notes.ini setting in the same way we had create_r9_databases=1 for ODS 52 in R9.
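    As a sketch -- the setting name comes from the paragraph above; the compact step is the same mechanism known from R9 with ODS 52, so verify the details against the V10 documentation. The setting goes into the server's notes.ini; new databases are then created with ODS 53, and existing databases pick it up with a copy-style compact:

```
Create_R10_Databases=1
```

    Then, on the server console, upgrade an existing database with a copy-style compact, e.g. `load compact mail/user.nsf -c`.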

    Applications should work unmodified! This also includes C-API applications.

    I have been testing my C-API based solutions, which include server tasks like my database catalog "nshdbcat" and tools like "nshrun", which allows more efficient database maintenance (compact, fixup, update etc.), and also my SpamGeek extension manager application.


    All applications work unchanged compiled with the compilers used for Domino 9 on Win64 and Linux 64.


    There is a new compiler used for Domino V10 and we are waiting for IBM/HCL to release details about official support for C-API based applications.

    But you can expect that existing applications continue to work. This is also what I heard from other partners.

    You still have to check with the vendor of the application to have an official support statement!


    If you did not have a chance to look into Domino V10 on your own, there is an interesting presentation from one of the lead developers and the messaging architect for Domino.

    Mike presented this session at the DNUG conference in June. And I had the great honor to present it again at AdminCamp this month, because the HCL development team was in the final preparations for Domino V10.

    Here is a link to the original slide deck -->
    https://dnug.app.box.com/v/sessions45/folder/50777353635

    If you have questions post comments or send me emails.

    I am looking forward to seeing many of you in Frankfurt


    Daniel




    LotusScript Tip: Lists with tags for large lists and keys

    Daniel Nashed  30 September 2018 07:59:34

    I have been looking for a fast way to store and look up values in an array/list to do some statistics, counting entries with a certain key.
    My first try was to use an array which I Redim'ed once to have a proper size.
    Compile-time specified limits are not sufficiently high for large arrays in my case.

    The routine was needed to evaluate data from my SpamGeek application (SMTP extension manager) used by a customer internally to control their SMTP traffic (allow only defined hosts to send SMTP mails).

    It turned out that using an array was very inefficient in terms of CPU consumption and run-time.

    One of the customer's developers came up with the idea of using lists with tags. I did not know about list tags before; they turned out to be very effective.
    A test routine showed great performance up to at least 500,000 entries, which is already quite high!

    Here are the interesting details of this approach, explained with an example. I took the code I wrote for testing and added some additional code and comments for demonstration purposes.

    - You first define a List of the data type you need. In my case a Long.
    - Then you can add your elements with the key you are using.
    - To query if an entry already exists, there is the Iselement function.
    - You can only get elements that are in the list. Otherwise you get a run-time error. So you have to test with Iselement first!
    - You can loop thru the whole list using Forall.
    - The Forall variable is the value that you stored in the list.
    - The Listtag function is used to return the tag that you used.

    The run-time is really impressive! And the missing part that I did not know about was the "List Tag".

    The implementation in the back-end looks highly optimized and there are simple calls to check the list to get elements or to add new elements.

    I highlighted the important parts of the code.

    Maybe this could help you in one of your next projects.

    -- Daniel

            Dim HostList List As Long
            Dim count As Long
            Dim i As Long
            Dim key As String        
           
            '  build a list where the counter is set to the list label for testing
           
            For i=1 To 100000
                    key = Cstr (i)
                    HostList (key) = i        
            Next
           
            ' check if key exists in list and get entry and label!
           
            key = "100000"
            If Iselement(HostList (key)) Then
                    count = HostList (key)
                    Messagebox "Found Key: " + key + " with value: "+ Cstr (count)
                    HostList (key) = count+123
            Else
                    Messagebox  "Not found"
            End If
           
           
            ' loop thru list and check if the key is found using the Listtag function
           
            Forall h In HostList
                    If (key = Listtag(h)) Then
                            Messagebox "Found Key: " + key + " with new value: "+ Cstr (h)
                    End If
            End Forall
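    If you are more familiar with other languages: a LotusScript List with tags is essentially a hash map keyed by strings. As a rough analogy (Python, not LotusScript -- just to illustrate the pattern, the names are mine):

```python
# Rough Python analogy of the LotusScript example above:
# a List with tags behaves like a dict keyed by strings.
host_list = {}

# Build the list where the counter is stored under a string key
for i in range(1, 100001):
    host_list[str(i)] = i

# Iselement() <-> "key in dict"; reading a missing key raises an error
key = "100000"
if key in host_list:
    count = host_list[key]
    print(f"Found key {key} with value {count}")
    host_list[key] = count + 123

# Forall + Listtag() <-> iterating over items(), where the key is the tag
for tag, value in host_list.items():
    if tag == key:
        print(f"Found key {key} with new value {value}")
```

    The performance characteristics are comparable: both give near constant-time inserts and lookups, which is why the list approach beats scanning an array.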


    Support Flash Alert: iOS 12 native Mail app authentication issue with session based authentication

    Daniel Nashed  19 September 2018 07:54:50

    There is a support flash for an issue with the native mail app in iOS 12.
    Before this change in iOS 12, a wrong configuration did not impact the user in normal operations. There were only issues when the password was changed.

    Mobile devices cannot handle forms-based authentication. If you configure session based authentication or multi-server session based authentication, the server will not use the basic authentication headers.
    On the other side, the recommended authentication on a Domino HTTP server and also on a Traveler server is multi-server session based authentication with LTPA cookies (from a security and performance point of view).

    For mobile devices connecting to Traveler you have to ensure basic authentication headers are used because mobile devices do not understand the forms-based authentication for sync requests (they do in the web browsers).

    Enabling basic authentication headers in combination with multi-server session based authentication is only possible if you use the more modern HTTP configuration leveraging "Internet Sites".

    Using an Internet Site you can override session based authentication for the /traveler URL by configuring an Authentication override rule.
    If the server has auto configuration enabled, the required documents will be created automatically when Internet Sites are used for the server.

    So the right configuration would be either no Internet Sites and basic authentication,

    or multi-server session based authentication and Internet Sites with the Override Session Authentication rule -- which is the recommended configuration even on a stand-alone Traveler server!

    This isn't a new requirement, and the wrong configuration already caused issues when a user's HTTP password changed. In that case the mobile device wasn't able to figure out that the password was wrong.
    The server sent the login form with a 200 status code instead of the authentication challenge with a 401. That wasn't understood by the mobile device.
    It worked by coincidence because the client sent the basic authentication header anyway.

    Here is an example of how your Internet Site documents should look:

                    Site name
                    Web Site: Nash!Com Traveler Website (domino.acme.de; 1.2.3.4)
                            Rule (Override Session authentication): /traveler*
                            Rule (substitution):  /Microsoft-Server-ActiveSync* --> /traveler/Microsoft-Server-ActiveSync*
                            Rule (substitution):  /servlet/traveler* --> /traveler*

    There is one additional setting that is required.
    In the Internet Site you have to ensure that once the user is authenticated with basic authentication for the Traveler URL, the user still gets an LTPA cookie:

    When overriding session authentication, generate session cookie: Yes


    Here is the link to the new technote:

    https://www.ibm.com/support/docview.wss?uid=ibm10731987

    It also contains a link to the documentation on how to properly configure the Domino HTTP task on your Traveler server:
    https://www.ibm.com/support/knowledgecenter/SSYRPW_9.0.1/httpauthentication.html

    -- Daniel
