Planet GRLUG

September 09, 2020

As it were ...

Lone flyer

It used to be I’d see a half dozen planes in the sky at a time.

When 9/11 happened all flying stopped, and meteorological scientists knew they had a very small window to see if planes affected the weather. As it turns out, they did. Not just climate change, but immediate, daily weather change.

I wonder how our weather is different this summer with so few planes flying.

The post Lone flyer appeared first on As it were....

by topher at September 09, 2020 08:50 PM

Whitemice Consulting

Dropping An Element In An Iterative Parse

I was using lxml's etree to iteratively parse an XML document, and I wanted to drop a specific element from the stream...

        for event, element in etree.iterparse(self.rfile, events=("end",)):
            if (event == 'end') and (element.tag == 'row'):
                pass  # process the row element here
            elif (event == 'end') and (element.tag == name_of_element_to_drop):
                element.getparent().remove(element)  # drop element

The secret sauce is: element.getparent().remove(element)

The document is a "StandardXML" document, like:

       ... elements...
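To see the whole pattern end to end, here is a minimal, self-contained sketch; the document content and element names below are my own stand-ins, not from the original stream:

```python
import io
from lxml import etree

# stand-in for the real document; the element names here are invented
XML = b"""<ResultSet>
  <row><value>1</value></row>
  <metadata>drop me</metadata>
  <row><value>2</value></row>
</ResultSet>"""

name_of_element_to_drop = 'metadata'

for event, element in etree.iterparse(io.BytesIO(XML), events=("end",)):
    if element.tag == 'row':
        pass  # process each row here
    elif element.tag == name_of_element_to_drop:
        element.getparent().remove(element)  # drop element

# the last "end" event delivered is the document root
root = element
print(etree.tostring(root).decode())
```

The dropped element is absent from the tree that remains after parsing, while the rows are untouched.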

by whitemice at September 09, 2020 06:06 PM

August 14, 2020

As it were ...

Factor Foods: Sun Dried Tomato Chicken

My second Factor meal was sun dried tomato chicken in a cream sauce, with noodled zucchini.

The food prepped on the plate.

There’s not a lot to say about this except it was delicious. The chicken was moist and tender and the zucchini the right amount of soft without being mushy or crunchy.

Cate and I both felt it needed salt, but that was easy. Two thumbs up on this one.

The post Factor Foods: Sun Dried Tomato Chicken appeared first on As it were....

by topher at August 14, 2020 09:45 PM

August 13, 2020

As it were ...

Factor Foods Taco Bowl

My wife recently saw an ad for a “prepared meal” place called Factor. We read over it and decided to give it a shot. They’re basically high end microwave dinners. They have quite a variety of options for various dietary needs: keto, gluten free, dairy free, etc.

Our purposes were speed of preparation, so we don’t eat out when we’re too busy/lazy/tired to make dinner, and also circumscribed portions. I’ll go back for seconds in a heartbeat for my wife’s cooking, but I’m unlikely to warm an entire new dinner for seconds.

We each got to pick which dishes we wanted. The one I had last night was called a Taco Bowl. Here it is with just the cardboard sleeve removed.

There are a dozen or so jalapeño slices in the top right, a chipotle yogurt cup in the bottom right, and the main dish on the left.

The instructions were to take the yogurt out, poke holes in the plastic, and microwave for 2 min. Here’s how it looks right out of the microwave:

Taco Bowl right out of the microwave.

The blob in the center concerned me, but when I started mixing I realized it was refried beans. That made me super happy. I don’t like the flavor of jalapeño, and I don’t like yogurt, so that left me with the taco stuff. I also don’t like veggies in my taco, so this was really ideal for me. I put some Taco Bell mild sauce in the meat and stirred and got a small bowl of chips to go along. I also mixed some lemonade and La Croix.

La Croix mixed with lemonade, chips, and taco meat in a bowl


I quite liked it. It’s different from my wife’s, but also quite different from Taco Bell. I was very happy there were no veggies in it, and I suspect even people who like veggies will be happy with that fact. I don’t think lettuce and tomato would last well in this packaging, and they’d be way better added in fresh.


When I was done I felt pleasantly full. I didn’t need to go get something else to eat. I can eat a lot, so that’s not insignificant.


The food containers all came in a box that had been packed with what felt like big plastic bags of ice. They were thick black plastic, much much tougher than a garbage bag, and it felt like there was a large block of ice inside each one, and they were about half thawed.

As it turns out, they’re not water ice, but rather some kind of gel. The gel gets thrown in the trash and the plastic bags are recyclable.

In fact, the only part that wasn’t recyclable was the thin clear plastic over the food. Everything else was cardboard or recyclable plastic.


I liked it quite a lot and I’m looking forward to the next thing. My wife had a roasted chicken last night that I tasted and it was quite good. I’ll blog a review of each food as I go.

The post Factor Foods Taco Bowl appeared first on As it were....

by topher at August 13, 2020 10:57 PM

May 24, 2020

Whitemice Consulting

Installing The Zoom Client On openSUSE 15.1

Uh oh: in a default-ish GNOME install of openSUSE 15.1 there are a couple of unmet dependencies. It appears Zoom Inc. did not try very hard when drafting the spec for their Linux clients.

awilliam@linux-tozb:~/Downloads> rpm -Uvh zoom_openSUSE_x86_64.rpm 
warning: zoom_openSUSE_x86_64.rpm: Header V4 RSA/SHA1 Signature, key ID 61a7c71d: NOKEY
error: Failed dependencies:
    libxcb-xtest0 is needed by zoom-5.0.408598.0517_openSUSE-1.x86_64
    ibus-m17n is needed by zoom-5.0.408598.0517_openSUSE-1.x86_64

Let's try the obvious...

awilliam@linux-tozb:~/Downloads> sudo zypper in libxcb-xtest0
Loading repository data...
Reading installed packages...
Resolving package dependencies...

The following NEW package is going to be installed:
  libxcb-xtest0

1 new package to install.
Overall download size: 17.7 KiB. Already cached: 0 B. After the operation,
additional 10.1 KiB will be used.
Continue? [y/n/v/...? shows all options] (y): y
Retrieving package libxcb-xtest0-1.13-lp151.3.2.x86_64
                                     (1/1),  17.7 KiB ( 10.1 KiB unpacked)
Retrieving: libxcb-xtest0-1.13-lp151.3.2.x86_64.rpm ................[done]

Checking for file conflicts: .......................................[done]
(1/1) Installing: libxcb-xtest0-1.13-lp151.3.2.x86_64 ..............[done]

awilliam@linux-tozb:~/Downloads> sudo zypper in ibus-m17n
Loading repository data...
Reading installed packages...
Resolving package dependencies...

The following 5 NEW packages are going to be installed:
  ibus-m17n libm17n0 libotf0 m17n-db m17n-db-lang

The following recommended package was automatically selected:

5 new packages to install.
Overall download size: 1.6 MiB. Already cached: 0 B. After the operation,
additional 6.9 MiB will be used.
Continue? [y/n/v/...? shows all options] (y): y
Retrieving package libotf0-0.9.13-lp151.2.3.x86_64
                                     (1/5),  47.6 KiB ( 86.3 KiB unpacked)
Retrieving: libotf0-0.9.13-lp151.2.3.x86_64.rpm ....................[done]
Retrieving package m17n-db-1.7.0-lp151.2.1.noarch
                                     (2/5),   1.3 MiB (  6.2 MiB unpacked)
Retrieving: m17n-db-1.7.0-lp151.2.1.noarch.rpm .........[done (7.8 KiB/s)]
Retrieving package m17n-db-lang-1.7.0-lp151.2.1.noarch
                                     (3/5),  17.1 KiB ( 23.0 KiB unpacked)
Retrieving: m17n-db-lang-1.7.0-lp151.2.1.noarch.rpm ................[done]
Retrieving package libm17n0-1.7.0-lp151.2.3.x86_64
                                     (4/5), 240.8 KiB (596.5 KiB unpacked)
Retrieving: libm17n0-1.7.0-lp151.2.3.x86_64.rpm ....................[done]
Retrieving package ibus-m17n-1.3.4-lp151.2.4.x86_64
                                     (5/5),  31.6 KiB ( 69.8 KiB unpacked)
Retrieving: ibus-m17n-1.3.4-lp151.2.4.x86_64.rpm ...................[done]

Checking for file conflicts: .......................................[done]
(1/5) Installing: libotf0-0.9.13-lp151.2.3.x86_64 ..................[done]
(2/5) Installing: m17n-db-1.7.0-lp151.2.1.noarch ...................[done]
(3/5) Installing: m17n-db-lang-1.7.0-lp151.2.1.noarch ..............[done]
(4/5) Installing: libm17n0-1.7.0-lp151.2.3.x86_64 ..................[done]
(5/5) Installing: ibus-m17n-1.3.4-lp151.2.4.x86_64 .................[done]

And what happens now?

awilliam@linux-tozb:~/Downloads> rpm -Uvh zoom_openSUSE_x86_64.rpm 
warning: zoom_openSUSE_x86_64.rpm: Header V4 RSA/SHA1 Signature, key ID 61a7c71d: NOKEY
error: can't create transaction lock on /usr/lib/sysimage/rpm/.rpm.lock (Permission denied)
awilliam@linux-tozb:~/Downloads> sudo rpm -Uvh zoom_openSUSE_x86_64.rpm 
warning: zoom_openSUSE_x86_64.rpm: Header V4 RSA/SHA1 Signature, key ID 61a7c71d: NOKEY
Preparing...                          ################################# [100%]
Updating / installing...
   1:zoom-5.0.408598.0517_openSUSE-1  ################################# [100%]
run post install script, action is 1...

Installed; and it works.
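In hindsight, the two manual zypper runs can probably be avoided entirely: handing zypper the RPM file itself lets it pull the missing dependencies from the repositories in one step (a sketch, assuming the same download location):

```shell
cd ~/Downloads
sudo zypper install ./zoom_openSUSE_x86_64.rpm
```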

by whitemice at May 24, 2020 06:52 PM

April 22, 2020

OpenGroupware (Legacy and Coils)

Code Example: Adding A User To A Team

You have a user whose login is "fred" and you wish to add them to the team named "User Support".

>>> from coils.core import initialize_COILS, AdministrativeContext
>>> initialize_COILS()
>>> ctx = AdministrativeContext()
>>> a = ctx.r_c('account::get', login='fred')
>>> a
<Contact objectId="9144860" version="617" displayName="Smith, Fred", login="fred" UID="9144860@27fd7d5-0c5e-4074-b2f0-7470a8sssdc9-815912229"/>
>>> t = ctx.r_c('team::get', name='User Support')
>>> t
<Team objectId=9154000 version=14 name="User Support">
>>> ctx.r_c('account::join-team', account=a, team=t)
  account::join-team 0.00363612174988s
>>> ctx.commit()
>>> ctx.r_c('team::get-logins', team=t)
  team::get-logins 0.0082790851593s
[u'george', u'fred', u'stanley']

The "r_c" method of the Context object is an abbreviated call to the "run_command" method.

by whitemice at April 22, 2020 06:58 PM

April 20, 2020

Whitemice Consulting

gEdit's Amazing External Tools

In a few recent conversations I have become aware of an unawareness: an unawareness of the awesome that is gedit's best feature, External Tools. External Tools let you effortlessly link the power of the shell, Python, or whatever into an otherwise already excellent text editor, yielding maximum awesome. Unlike similar features in many IDEs, External Tools are drop-dead simple to use; you do not need to go somewhere and edit configuration files, etc... you can create and use them without ever leaving the gedit UI.

Plugins tab of the Preferences dialog.

To enable External Tools [which is a plugin - as is nearly every feature in gedit] go to the Plugins tab of the Preferences dialog and check the box for "External Tools". External Tools is now active. Close the dialog and proceed to define the tools useful to you.

With External Tools enabled there will be a "Manage External Tools..." option in the Tools menu. The Tools menu also gains an "External Tools" submenu; every external tool you define will be available in that menu, automatically. The list of defined tools in that submenu will also show whatever hot-key you may have bound to each tool - helpful, as you likely will not remember them at first.

Manage External Tools Dialog

Within the Manage External Tools dialog you can start defining what tools are useful to you. For myself the most useful feature is the ability to perform in-place transformations of the current document; to accomplish this set input to "Current Document" and Output to "Replace Current Document". With that Input & Output the current document is streamed to your defined tool as standard input and the standard output from the tool replaces the document. Don't worry - Undo [Ctrl-Z] still works if your tool did not do what you desired.

What are some useful External Tools? That depends on what type of files and data you deal with on a regular basis. I have previously written a post about turning a list of values into a set format - that is useful for cut-n-paste into either an SQL tool [for use as an IN clause] or into a Python editor [for x=set(....)]. That provides a very simple way to take perhaps hundreds of rows and turn them into usable data.

Otherwise some tools I find useful are:

Format JSON to be nicely indented

python -m json.tool

Use input/output settings to replace current document.

Open a terminal in the directory of the document

gnome-terminal --working-directory=$GEDIT_CURRENT_DOCUMENT_DIR &

Set the input/output for this action to "Nothing"

Remove leading spaces from lines

sed 's/^[[:blank:]]*//'

Use input/output settings to replace current document.

Remove trailing spaces from lines

sed 's/[[:blank:]]*$//'

Use input/output settings to replace current document.

Keep only unique lines of the file

sort | uniq

Use input/output settings to replace current document.

Format an XML file with nice indentation

xmllint --format - -

Use input/output settings to replace current document.
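Any of these filters can be tried from a shell before wiring it into gedit; for example, with a JSON string standing in for the current document:

```shell
# the document arrives on stdin; the formatted result replaces it
echo '{"b": 2, "a": 1}' | python3 -m json.tool
```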

IN Clause Generator

This takes a document with one value per line and converts it to an SQL like IN clause. The output is also appropriate for creating Python set values.

#!/usr/bin/env python
import sys

iteration = 0    # count of values emitted so far
line_length = 0  # length of the current output line
text = sys.stdin.readline()
while (text != ''):
  text = text.strip()
  if (len(text) > 0):
    if (iteration != 0):
      sys.stdout.write(', ')       # separator before every value but the first
    if (line_length > 74):
      sys.stdout.write('\n ')      # wrap long output lines
      line_length = 0
    sys.stdout.write("'%s'" % text)  # the quoted value itself
    line_length = line_length + len(text) + 4
    iteration = iteration + 1
  text = sys.stdin.readline()

Input is "Current document", output is "Replace current document".
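The same transformation can also be expressed as a small function, which is handy for testing the logic outside of gedit (a sketch; the function name and wrap-width default are mine):

```python
def to_in_clause(lines, width=74):
    """Turn an iterable of values into a quoted, comma-separated
    string suitable for an SQL IN clause, wrapping long lines."""
    parts = []
    line_length = 0
    first = True
    for text in lines:
        text = text.strip()
        if not text:
            continue  # skip blank lines
        if not first:
            parts.append(', ')
        if line_length > width:
            parts.append('\n ')  # wrap the output line
            line_length = 0
        parts.append("'%s'" % text)
        line_length += len(text) + 4
        first = False
    return ''.join(parts)

print(to_in_clause(['fred', 'george', 'stanley']))
```

Three values on three input lines come back as 'fred', 'george', 'stanley', ready to paste inside IN ( ... ).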


by whitemice at April 20, 2020 05:21 PM

April 11, 2020

As it were ...

Cambridge Local Shops – Help During Covid-19

A couple weeks ago a friend of mine named Elisabeth Klaar contacted me to let me know she was working on a cool new project. She lives in Cambridge, England and many of the businesses local to her are either partly or fully closed during the pandemic. She started a business listing site that lists all the local businesses and describes what they’re doing during the pandemic to stay in business.

It’s called Cambridge Local Shops – Help During Covid-19.

Some of the businesses are offering delivery, some are offering special hours for the elderly, and others are offering a variety of things.

At the time there was a lot of manual moving of data, and I saw some places I could help streamline, so we worked together to make a better system. This post will include all the code we wrote in easy-to-use plugins, themes, and import files, instructions on how to use them, and some discount codes for some of the plugins we used.

The idea is that YOU can build this same thing just by installing what we built for you.

Software Overview

The highlight is that we have a custom post type to hold the businesses with Advanced Custom Fields adding some custom fields to it. We’re using WSForm with the Post Management Addon to make a form for stores to submit their own information as Draft businesses. The theme is a child theme of GeneratePress with some custom templates and CSS to render the businesses nicely.

Getting The Resources

GeneratePress – That link will get you a 20% discount if you’re building a project like this. The paid version is not required, but Tom is a great guy making a living building great software. Support him if you can.

Cambridge Child Theme – This contains the CSS and template files. It’s on github, so click the green button in the top right of the page and download the zip and install it on your site.

WSForm – That link goes to the paid version, which gets you the Post Management Plugin. Use the coupon code STAYINSIDE to get 20% off to build something like this.

Simple Business Listing plugin – This is a plugin I wrote, using code I got from another project; it provides the custom post type. This is also on github, but downloading the zip gets you a normal plugin.

ACF and WSForm import files – Both ACF and WSForm have the ability to import what you need for this project. This file contains them both; simply unzip it on your computer and use the Import function of each plugin to import the proper file.

What You Get

The Site

You’ll get a Business archive at /businesses like this one. Single businesses will each get their own page with a singular slug like /business/the-essential-soaps/. Here’s an example.

If a business chooses to publish its physical location, the address is a link to Google Maps. Phone numbers are printed at the discretion of the business.

Inputting Data

You also get a form that businesses can use to submit themselves to the listing, like this one. Submissions are saved as a draft so you don’t get garbage posted to your site. The site owner gets emailed about the submission and can log in and Publish the business if they wish.


There are two taxonomies, one for Business Type and one for Location, so people can sort by interest or location.


Check out the site, and if you think your neighborhood could use something like that, try using these tools. If you get stuck leave me a comment and I’ll help if I can.

There’ll be another story about Elisabeth’s experience posted on HeroPress on Wednesday; I’ll link to it here then.

The post Cambridge Local Shops – Help During Covid-19 appeared first on As it were....

by topher at April 11, 2020 08:20 PM

April 05, 2020

As it were ...

Week Four

Friends of mine are doing a good job of keeping track of how much worse things are now than last week, but I’ve given up. I couldn’t tell you how many are sick or how many have died, but I can tell you the numbers are still climbing in the U.S. I saw an article that said Spain’s number of sick has fallen for 2 days in a row.

The President has said we should keep up social distancing until May. Some states haven’t even started it yet. Delta Airlines emailed me and said that their points programs are essentially on hold until 2021.

Kinsa has made a “Real-time map of influenza-like illness in the United States.”

Arrow pointing at my county in MI.

We’re all still healthy.

We did our second week of church from home. It works pretty well. The pastor sits next to a TV with his slides on it, and occasionally they’ve had people say something on that TV, and they’ve done a good job of piping that audio to the Zoom instance, so it sounds really good instead of just a TV across the room.

Sunday School in Zoom is interesting because people can put things in the chat while the speaker is talking, without it being an aural disruption.

I know that Michigan is hit kind of hard, but it seems to be mostly in Detroit. I heard yesterday that 4 people in “Southwest Michigan” died. I have no idea what the Grand Rapids hospitals are like.

I know a couple people who’ve lost their jobs and are concerned about finances, but not many. Economists on TV are talking about some pretty serious doom and gloom.

We haven’t even started recovering yet.

The post Week Four appeared first on As it were....

by topher at April 05, 2020 06:06 PM

March 23, 2020

As it were ...

What can you do?

When I was a senior in high school my dad taught Emergency Medicine to my science class. At the end of the school year we were all certified EMTs. One of the things he taught us is how to announce our abilities. If you come across an accident, you spot the person in charge and go to them and clearly say “I have Emergency Medical training!”

This lets them know that they have a resource available to them, and they have no other way of knowing.

This last Sunday someone on twitter put out a call to action to anyone who works in ecommerce. People from a wide variety of ecommerce companies, companies that are traditionally competitors, all joined a new Slack and built a website full of information to help brick and mortar stores move online in this time of economic struggle.

As is common on many Slack instances, there’s an introduction channel. You join and then the first thing you do is state who you are and what you do. I noticed someone join and then say this:

I can help with everything that touches an ecommerce site, from tech, processes, sales management, operations, warehouse management, etc. I have ran and or built, $2M, $30M and +$500M ecommerce sized operations in B2B and DTC.

I was struck that in almost any other context this would come across as terribly arrogant, but in this case it’s exactly what needed to be said. He didn’t say anything more than that; he simply got to work, and that’s awesome.

He did the equivalent of what my Dad taught us to do. When there’s a disaster, clearly announce what you’re good at, and get to work.

What are you good at?

The post What can you do? appeared first on As it were....

by topher at March 23, 2020 01:44 PM

March 21, 2020

As it were ...

Physical Distancing Week Two

This whole pandemic thing is happening so quickly, and yet it feels like it’s been going on for a really long time. It was only the beginning of this month that we were trying to decide whether to cancel events.

We’re not exactly under quarantine, the Governor has simply asked people to engage in physical distancing, reducing human contact. Four states have now actually required “shelter in place” quarantine, where people aren’t supposed to leave their homes except in case of strong need.

This hasn’t been a huge burden on my family yet, we tend to only go out when we need to anyway, so we’re still doing that. Groceries, etc. We’ve gotten takeout food a couple times, but of course all restaurants have closed their seating areas.

Today is really the first time where we wanted to go out just for fun, and decided not to do that fun thing. Even that was going to a cool grocery store. But we didn’t really need anything, we just wanted the environment, so we stayed home.

Video Chats

Maybe once before in my life have I taken part in a video chat whose purpose was only to chit chat. This week I’ve been involved in four. My daughter has started video chatting with her friends multiple times per day. At work we’re talking about setting up chats within business verticals, to get merchants talking to each other. This feels like a New Thing in society.


This feels like one of the biggest things to me so far, because it’s fundamentally changing the structure of our government. Tax day has moved from 15 April to 15 July. The amount of infrastructure involved in changing that blows my mind. The number of things that count on that tax money hitting at the same time every year is something I can’t even wrap my head around.


My work hasn’t really been impacted, other than my co-workers are now also working from home. Cate’s work seems to be picking up some. Sophia’s work online dropped off. Ema works at a Starbucks and her co-workers have been staying home sick, so her hours have really spiked. She’s working really hard, coming home tired, and trying to keep up. I’m proud of her, but concerned.

Stay tuned for more in the future.

The post Physical Distancing Week Two appeared first on As it were....

by topher at March 21, 2020 05:26 PM

March 15, 2020

As it were ...

Blogging COVID-19

The other day I saw this most excellent tweet:

I blogged a lot when my kids were young because the time was important to me and I wanted to keep track of it all. This is an important time too, and I’m going to try to keep track of it.

The Beginning

I suspect I learned about “the corona virus” about the same time and way as most Americans. It was on the news when Wuhan started getting bad. We watched every morning as it grew and grew. The city shut down, then the province. Then it started getting out of China.

Its first impact on my world was when WordCamp Asia 2020 was canceled in February. I wasn’t planning on going, but many people I know were either going to go or were working on it. Either way, it hit close to home.

Then it started popping up in other places, places I didn’t expect, like Italy. I assumed it would spread on the ground, and take a while.

We started to wonder if perhaps WordCamp Europe in June would be canceled. “Surely not!” we thought. But then it was. Not only that, WordCamp Central strongly suggested that ALL WordCamps before 1 June, anywhere in the world, should be canceled. Sure enough, the emails started to roll in.

“It was a tough decision, but WordCamp X has decided to postpone until next year”

The Big Shift

It really started to hit home that things are going to be different now this last week. Here are some things that announced cancellation or long term closure just this week:

  • The NBA
  • The NFL
  • Major League Baseball
  • The NHL
  • March Madness
  • SXSW

And dozens and dozens of other very large things. Google has asked its ENTIRE workforce, globally, to work from home. Shopify has asked its entire workforce to work from home, AND offered $1000 to each employee for home office upgrades. That’s 5 million dollars.

Friday evening we went to the Apple store in the mall. One of the employees was telling us he’d heard rumors that the entire mall would be closed.

The next day (yesterday) we went back to return a phone case. The Apple store was closed, but the doors were open. Standing in the doorway were two employees, and there was a taped off area 6 feet around them. They were explaining that the store would be closed for 2 weeks. Return times would be extended to take that into account. As it turns out, Tim Cook decided in the middle of the night Friday night to close all stores, except, ironically, in China, where things are getting better.

The rest of the mall was eerily quiet and nearly empty.

I heard today this could go on for six months. I think a lot of restaurants are going to close permanently. They can’t afford no income for six months.

The Toilet Paper Thing

Someone told me there was a run on toilet paper at a Costco. I laughed. There’s always some region that goes bananas and does something stupid like that. Then another friend said “Yeah, my Costco has no toilet paper either!” Then another up north, where people are usually smarter than this, said “Yeah, our local Target has no toilet paper either!”

I thought “Really? REALLY?”. Then we went to our local Target and I peeked down the toilet paper aisle.

Empty shelves at Target. Really.

Not only that, we went to our small local grocery store. Same thing.

Yesterday things were better and I saw some on shelves, and my daughter was able to get a pack from the grocery store.

Still. Wow.


So far I have no fear at all, for a variety of reasons. One is that I have complete faith in God to handle things. Yes, terrible things are going to happen. If you want to know how I can reconcile that, let me know, I’d love to have a long conversation with you about it.

Another is that my own family are all quite healthy and robust. Even my in-laws in their 70s are vibrant and strong. If we get this thing it’s going to be just as miserable as all the other flus we’ve had and we’ll move on.

My only real feelings about the whole thing are both excitement and guilt.

Excitement because this is a momentous, historic, global thing that I’m getting to live through and experience. It’s like going through a huge snow storm.

Guilt because I know this time is going to SUCK for so many people. How dare I be excited for all that misery? But there it is. I watch the news in anticipation, like watching a sporting event. What will happen next?

Your Thoughts

I’d love your comments here, but I really do think we all need to be blogging like crazy right now. Our grand kids will thank us.

The post Blogging COVID-19 appeared first on As it were....

by topher at March 15, 2020 05:48 PM

February 10, 2020

As it were ...

States I’ve Traveled To

This one is going to get updated as I go to new states. That makes 35, including Washington D.C., which I think is of equal importance.

US Map showing states I've been to.

  • Alaska
  • Arizona
  • Arkansas
  • California
  • Colorado
  • Delaware
  • Florida
  • Georgia
  • Illinois
  • Indiana
  • Iowa
  • Kansas
  • Kentucky
  • Maryland
  • Massachusetts
  • Michigan
  • Minnesota
  • Missouri
  • Nebraska
  • New Jersey
  • New Mexico
  • New York
  • Ohio
  • Oklahoma
  • Oregon
  • Pennsylvania
  • South Dakota
  • Texas
  • Tennessee
  • Utah
  • Virginia
  • Washington DC
  • West Virginia
  • Wisconsin
  • Wyoming

The post States I’ve Traveled To appeared first on As it were....

by topher at February 10, 2020 08:55 PM

December 28, 2019

As it were ...

New Year’s Resolutions for 2020

I normally don’t do New Year’s resolutions. If I want to change something I typically mark a start point and go with it. This year it’s kind of a coincidence that there are some things I’d like to change at this time of year. Here’s what I have going on at the moment.

Use Time More Wisely

This is going to be broken into four specific areas, starting with things I’m going to stop.

1. Less TV

I’m not a huge TV watcher, all things considered. I try hard to avoid shows that I know I’d love and get sucked into. That said, most evenings I find myself and my wife on the couch watching TV. We have some favorite shows that we like to keep up with. But then there are times when we’re caught up, and we flip though LOOKING for something to watch. I want to stop that, and cut back a lot, which will give me more time for the things I DO want to do.

2. Better Home Ownership

We’ve owned a house for 18 years now, and there are some things I haven’t been good about. I’m bad about raking the roof after a snow, which hurts both the roof and the gutters.  I’d like a beautiful lawn, but I’m terrible about mowing and watering. My garage is both dirty and messy. Those things are going to change.

3. Do Tasks More Promptly

This is a small thing, but I think it’s going to pay off big. I tend to procrastinate with things that don’t seem immediately important. The mail piles up because I know it’s all spam; I’d like to take care of it as soon as it comes in. Two of the last three years the Christmas tree has gone in the trash after August. It wasn’t up in the house, it was lying behind the garage. Two years ago our Halloween pumpkins grew into beautiful pumpkin plants the next spring and grew some really nice pumpkins. When I’m done with a dish I don’t put it in the dishwasher, I put it on the counter right above it. All these, and dozens more, are things I’d like to be prompt about.

4. Get up at 6:30 am eastern time everyday

My wife has asked me to do this for a couple months for a variety of reasons. She tends to wake up naturally around then anyway, and just waits for me to wake up. But we’d like some more consistency in the daily schedule.  We’ll start going to bed more regularly too.

I also want to keep this same schedule when I’m in Texas for work.  It’s only one hour difference, but it’ll keep my internal clock on the same schedule when I get home.

Become more flexible through stretching

Over the last 10 years or so I’ve become progressively less flexible, mostly because I don’t stretch anything, I sit all day, and I’ve gotten fat. I’m being really transparent here. It’s really hard for me to put on socks because it’s hard for me to reach my feet. If something falls on the floor I dread it, because unless I get on my knees I really struggle to reach the floor. I’m pretty convinced most of this issue is due to the fact that I simply never stretch myself. So I’m going to start stretching. I might get into yoga, but I’m going to start by simply doing the things it’s hard to do, many times a day. I’m hoping to spend 30 min or so every morning (see getting up at 6:30) simply stretching.

Lose weight

I actually started this a couple months ago, and I did quite well for a while.  I didn’t revert, the fasting simply stopped making me lose weight. There was some minor cheating, but not much. I’m going to keep working on fasting, it’s the least painful method I’ve ever used.

Don’t buy alcohol

I drink more than I’m happy with. I don’t feel like I have an “alcohol problem” I feel like I have a “I’m spending too much money on something that makes me fat” problem. My preferred type of drink tends to be stuff like Mike’s Hard Lemonade, which isn’t a lot different from pop. It’s pretty low alcohol, high sugar, and probably the most expensive beverage I could consume. So my plan isn’t to quit alcohol, simply stop buying it.  This allows me one or two free drinks at a WordCamp, enjoying a bottle of something someone gifted us, etc, but since both of those things are relatively rare I don’t think they’ll hurt my waistline any, and they can’t hurt my wallet.

Learn Javascript

I remember when Javascript was invented. The most popular thing to do with it was make little messages appear in the status bar at the bottom of the browser.  Then there were the spam pop-ups. And DHTML (Dynamic HTML). DHTML made it so you could make a little image follow your cursor, and make images change when you put your mouse on them, etc. It was generally accepted that it was worthless at best, dangerous at worst. I chose not to waste my time on it.  I learned CSS and PHP instead.

Well, I think it might be sticking around. When I first got into WordPress, custom post types were new, and no-one really knew how to use them.  This put me at the same level as people who had been using WordPress for years, and my strong PHP skills rocketed me to the forefront of WordPress.

These days Gutenberg is the new hotness, and to get really creative with it you have to know not only javascript, but also React. All of a sudden I feel pretty irrelevant as a WordPress developer. I can’t make any of the cool new things that other people are making. I feel like I’m really starting at ground zero here.  All those people I’ve told over the years “if you take some time and learn it, you can make a living as a WordPress developer”. Now it’s time for me to take some time and learn it.

What are your resolutions?

The post New Year’s Resolutions for 2020 appeared first on As it were....

by topher at December 28, 2019 10:17 PM

November 25, 2019

Whitemice Consulting

Uncoloring ls (2019)

This is an update from "Uncoloring ls" which documents how to disable colored ls output on older systems which define that behavior in a profile.d script.

Some more recent systems load the colorization rules in a more generalized fashion. The load still occurs from a profile.d script, typically ls.bash, but mixed in with other functionality related to customizing the shell.

The newer profile.d script looks first for $HOME/.dir_colors, and if not found looks for /etc/DIR_COLORS.

To disable colorized ls for a specific user create an empty .dir_colors file.

touch $HOME/.dir_colors

Or to disable it for all users, make the /etc/DIR_COLORS file not exist.

sudo mv /etc/DIR_COLORS /etc/DIR_COLORS.disabled

by whitemice at November 25, 2019 06:33 PM

November 09, 2019

As it were ...

My Cool Weight Loss Tracker

In my original post about losing weight I posted a chart showing progress. It used a shortcode to pass the data to a bit of javascript from Google. The downer is that I had to edit that post every day and add a day’s data to an increasingly long string.  It didn’t seem to scale.

I thought about some other options.  I was going to make a Settings page with a set of repeater fields, and update that, but I couldn’t find a small library and I didn’t want to write it by hand.

I thought about using Sugar Calendar and simply putting the date on every day. I should mention that I’d like the data to be available in more places than one, so I could make something else with it someplace else, so I was going to make an endpoint on my blog no matter what I chose.

A good talk with Bill Erickson helped me see the light. Google sheets can return a JSON blob if you pass the right URL.  So now the data is stored in Google Sheets, accessible from anywhere, and I update it there rather than updating a single blog post every day.
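
The trick works because Google Sheets exposes a gviz query endpoint (roughly `https://docs.google.com/spreadsheets/d/<sheet-id>/gviz/tq?tqx=out:json`) that returns the sheet wrapped in a JavaScript callback rather than as bare JSON. A minimal Python sketch of unwrapping such a response — the sample payload and its values are made up for illustration, not my actual data:

```python
import json
import re

def parse_gviz(payload: str) -> list:
    """Strip the JSONP-style wrapper from a Google Sheets gviz
    response and return the rows as lists of cell values."""
    # Real responses look like:
    #   /*O_o*/
    #   google.visualization.Query.setResponse({...});
    match = re.search(r"setResponse\((.*)\);?\s*$", payload, re.DOTALL)
    data = json.loads(match.group(1))
    rows = []
    for row in data["table"]["rows"]:
        rows.append([cell["v"] if cell else None for cell in row["c"]])
    return rows

# A trimmed, invented sample of what the endpoint returns (real
# responses also include column metadata):
sample = (
    "/*O_o*/\n"
    'google.visualization.Query.setResponse({"table":{"rows":['
    '{"c":[{"v":"2019-10-07"},{"v":230}]},'
    '{"c":[{"v":"2019-10-08"},{"v":229}]}]}});'
)

print(parse_gviz(sample))  # [['2019-10-07', 230], ['2019-10-08', 229]]
```

Once the wrapper is stripped the data is ordinary JSON, usable from WordPress, a sidebar widget, or anywhere else.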

I’ve also put some stats in the bottom of my sidebar.

I haven’t exactly lost a pound a day since the beginning, WordCamp US set me back quite a bit, but it’s still moving steadily and I am content.

Total Weight Lost: 2 lbs. in 333 days.

The post My Cool Weight Loss Tracker appeared first on As it were....

by topher at November 09, 2019 01:34 AM

October 21, 2019

Whitemice Consulting

PostgreSQL: "UNIX Time" To Date

In some effort to avoid time-zone drama, or perhaps due to fantasies of efficiency, some developer put a date-time field in a PostgreSQL database as an integer; specifically as a UNIX Time value. ¯\_(ツ)_/¯

How to present this as a normal date in a query result?

date_trunc('day', (TIMESTAMP 'epoch' + (j.last_modified * INTERVAL '1 second'))) AS last_action,

This is the start of the epoch plus the value in seconds - UNIX Time - calculated and cast as a non-localized year-month-day value.

Clarification#1: j is the alias of the table in the statement's FROM.

Clarification#2: last_modified is the field which is an integer time value.
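
The same arithmetic is easy to sanity-check outside the database. PostgreSQL also provides the built-in to_timestamp() function for this conversion, but a minimal Python sketch shows the epoch-plus-seconds idea directly (the sample value is made up):

```python
from datetime import datetime, timezone

# last_modified as stored in the table: seconds since the epoch
last_modified = 1571664960

# Equivalent of TIMESTAMP 'epoch' + (last_modified * INTERVAL '1 second')
ts = datetime.fromtimestamp(last_modified, tz=timezone.utc)

# Equivalent of date_trunc('day', ...) presented as a plain year-month-day
print(ts.date().isoformat())  # 2019-10-21
```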

by whitemice at October 21, 2019 01:36 PM

October 19, 2019

As it were ...

When It’s Not Imposter Syndrome

I remember the first time I heard about imposter syndrome. It was from Chris Lema. At the time I thought “Huh, that makes sense that people would struggle with that”.

Ironically, I don’t think I’ve ever really struggled with imposter syndrome. I’ve always had a pretty good handle on what I’m good at, and what I can do. When I compare myself with others it’s usually either to figure out how to boost myself to their level, or to simply admire the fact that they’re at a level I’ll never reach. I can be content with that.

There’s always been a thought in the back of my head though, “What if it’s NOT imposter syndrome? What if some people really aren’t Good Enough for the task at hand, and have been lucky? Am I there?”

For a long time I wasn’t there. I was Good Enough for the task at hand. I could objectively say “They asked me to do X, I know how to do X, so I’ll go do it”.

But then one day I wasn’t.

I took a job with a big agency that does some big, hard projects. I knew how to do much of what they asked, and I assumed I’d learn how to do what I didn’t know. As it turned out, they hired me expecting me to already know those things. This was a mistake on both our parts.

If we had had lots of time, and money were not an issue, they could have taught me what I needed to know. But the reason they hired me was to get a specific job done. For that job I wasn’t Good Enough.

This wasn’t immediately apparent. Everyone takes a little time to get into the swing of things at a new job. It’s when that swing doesn’t happen that you start to wonder. It was in my third month on the job that I think everyone realized I wasn’t Good Enough for this job. I wasn’t happy with the work I was doing, and I wasn’t happy with how often I had to ask someone else to stop what they were doing to do what I was supposed to do because I didn’t know how.

My supervisor was sympathetic, but had to do his job, which was properly staff his team. He gave me 2 months to figure things out, but also told me early enough that I could look for a job at WCUS.

In the end I was at the company for only 5 months. They gave me a small severance package, which was very kind of them, they didn’t need to.

I’ve thought a lot about this experience over the last few years. Ironically it still didn’t give me imposter syndrome. I still knew what I was good at, and now I knew something I wasn’t good at.

It’s really really important to remember that just because you’re not good enough for a specific task doesn’t mean you’re not Good Enough. Just because you don’t know something doesn’t mean you’re dumb. Take your experience and learn from it.

Mine started me down the road toward not being a professional web developer anymore. I’ll never stop BEING a web developer, just like a plumber doesn’t stop knowing how to fix a pipe when he retires. But day to day he’s doing something else.

Now I’m doing something else. Something I’m actually better at than web development, and it brings me joy and provides for my family.

I want to summarize by saying that if what you’re doing doesn’t feel right you should think hard about it. Some people will tell you “Oh that’s just imposter syndrome”. And they might very well be right. But look deeply anyway. Find your OWN path.

As a person you’re ALWAYS Good Enough. Whether you’re prepared for the task at hand is another thing entirely.

The post When It’s Not Imposter Syndrome appeared first on As it were....

by topher at October 19, 2019 11:11 PM

October 13, 2019

As it were ...

Time To Change My Eating

I’ve put on weight every month for the last 36 months or so. Every month I’ve weighed the most I’ve ever weighed. I’m getting tired of it. This last Monday I started a “diet”. It’s loosely called intermittent fasting, though that term means different things for different people.  For me it means I only eat during a 4 hour period in a day.  That’s a 20 hour fast. I chose to eat only between 4pm and 8pm.  Practically speaking I generally don’t eat until 5 or 5:30 when my family eats.  I also find myself cheating a little in the evening, and snacking between 8 and 9. But if I start late, I don’t mind ending late.

I haven’t had real hunger during the day yet, it’s been really really comfortable.

I whipped up a WordPress plugin to help me track how I’m doing.  This chart will be updated every day:

Total Weight Lost 0 lbs. in 0 days.

So far I’m losing a pound a day. I have no idea how long this will go on so easily, we’ll see. At this rate I should be down 25lbs by WCUS. We’ll see if anyone notices.

The post Time To Change My Eating appeared first on As it were....

by topher at October 13, 2019 11:10 PM

September 24, 2019

As it were ...

Living Life With Tourette Syndrome

I was 47 years old when I learned I’d had Tourette Syndrome ever since I was about 10 years old.

I’d heard of it of course.  It’s that weird disease that makes you yell swear words at inappropriate times, right? Well, it’s not a disease, and only about 1% of people who display symptoms have the swearing symptom.

How did I find out? I randomly watched a video on YouTube of a comedian who plays off his Tourette’s for his comedy. His name is Samuel J. Comroe, and the longer I watched the more I heard about my own life. Check it out, it’s REALLY funny.

The most common Tourette’s symptoms are tics. It’s like a twitch, but twitches are usually one-offs, single or few instances.  A tic can be something as benign as sniffing a couple times per minute. Or a light cough. My first memory of any of this is from when I was about 10 and my mom said one day “What’s with the cough-sniff?” and I said “What are you talking about?”.

She said “Every couple minutes you cough and then sniff”. I said “No I don’t, why would I do that?”. But then I started noticing she was right. There are two kinds of tics in Tourette’s, auditory and muscular. The famous swearing symptom is auditory, but it can be anything.  My first tic combined two auditory tics, and I’ve never had another.

There’s another taxonomy of tics that contains transitory and chronic tics. Mine have been exclusively transitory, though I have one now that I’ve had for years, and I wonder if it will stay. Transitory tics last a few days, weeks, or months, and then fade away. They rarely return, but I’m quite careful not to do them on purpose just to see.

My first really noticeable tic started while I was at camp one summer, so it was a surprise for my family when I came home. You know how you can move your jaw side to side a little bit, and flex the joint, and maybe even pop it like cracking a knuckle?  I started doing that, except also flexing the muscles on my cheek. But only on one side of my face. I sat at dinner the first night home and my dad said “Why are you doing that?!?! You look retarded!”

I need to point out here that my dad was rarely that callous when I was a kid, and I had a good enough relationship with him that I was able to say “I can’t help it, back off!” and he did and I wasn’t scarred by it.

I’d also like to point out here that my dad was a paramedic instructor and my mom was a Registered Nurse, and it never occurred to either one of them in my whole life that I might have an actual neurological disorder to explain this stuff.  My family just said I lived in the Twitchy Zone.  They all came to accept that I had tics.

Over the years I’ve had the ever common shoulder roll a couple times.  We’ve all seen baseball players do it as they come to the mound. I’d just do it every 45 seconds or so for 6 months. (Note, as far as I know, all of my tics have gone to sleep with me at night, I don’t have any tics while sleeping.)  One time after I graduated from college I noticed my forehead muscles ached. Then I realized I had been flexing them every 30 seconds or so for days.  That one lasted just a few weeks. My roommate hated it, he couldn’t understand why I kept doing that when I looked at him.

It’s really hard to cuddle up with my wife and sit still to watch a movie or fireworks or anything. My current tics are small, but she can feel every one of them and it’s really uncomfortable.

My current tics:

  • I move my fingers against each other so they rub, kind of like scratching a slight itch. Many people do this, so unless you watch me long term it’s really hard to notice.
  • I flex the muscles around my ears, forcing my ears back away from my eyes, which pulls my glasses up. Again, glasses wearers will tell you we all do this, but the movement is SO tiny that people don’t usually notice. I just do it every few minutes.
  • My left bicep has had a light tic for a couple years now. It just barely flexes for about a quarter second. Most people don’t notice, but a few people have asked me about it.  I suspect far more notice that say anything. But even that is a small motion, so unless you’re in a conversation with me, or watching closely, you won’t notice.

I’ve always wanted a tic that made my abs flex spontaneously every few seconds, so I could get a free sixpack. Alas.

Tics are often called involuntary, but they’re actually unvoluntary. This means that I can stop a tic any time I want just by thinking about it, but the longer I don’t do it, the more mental focus it requires to keep it from happening. After a few minutes, 100% of my focus is on making it not happen, and as soon as I think away, it happens again.

After watching Comroe’s video I just sat in silence for a while, thinking about all the tics over all the years.  I started reading about Tourette’s and found that I fit the symptom profile perfectly for all age groups. Kids are more likely to be vocal.  It’s worse in the teen years (because who doesn’t need to look different as a teen?). It gets less pronounced in the adult years.

I read about other common symptoms, and was astonished to discover I have most of the symptoms of ADHD. Again, all I knew was ADHD was “hyper” and I was never hyper. But boy do I have the actual symptoms. OCD is another common co-symptom, and while mine is pretty specific, I absolutely have it in some places.

There isn’t really a treatment. Symptoms are rarely bad enough to change one’s capabilities in life. If they make you look or act unusual then you have to get around that, but it’s really not that bad for most people. For a few the tics can be very dramatic, like throwing oneself on the ground, or swinging arms in a wide arc.  Even for those folks the treatment is usually based around hypnosis or something.  Remember the focus thing? That can be exercised and enhanced if you really need to, and it can help a lot of people.

I have been extraordinarily blessed in my life that no-one has ever made me feel bad or teased me about any of this. Kids can be amazingly cruel, and I never got any of that.

It’s really hard to describe how life changing it has been to know what’s been going on all these years.  It’s even weird to say, because my life hasn’t changed. Nothing is any different. But now I know why I was different from the other kids. Why my body does this stuff that I can’t seem to control. There’s a reason, I’m not just randomly out of control of my own body.

I wrote this post so that maybe someone else like me will find it and come to the same understanding. I also hope it’ll help YOU, dear reader, understand what Tourette’s is, and perhaps spread that understanding, so that fewer people make it to their fifties before knowing what they’re dealing with.

Here’s some reading material on Tourette Syndrome:

The post Living Life With Tourette Syndrome appeared first on As it were....

by topher at September 24, 2019 03:33 AM

September 11, 2019

Whitemice Consulting

PostgreSQL: Casted Indexes

Dates in databases are a tedious thing. Sometimes a time value is recorded as a timestamp, at other times - probably in most cases - it is recorded as a date. Yet it can be useful to perform date-time queries using a representation of time distinct from what is recorded in the table. For example, a database may record timestamps while I want to look up records by date.

To this end PostgreSQL supports indexing a table by a cast of a field.

Create A Sample

testing=> CREATE TABLE tstest (id int, ts timestamp);
testing=> INSERT INTO tstest VALUES (1,'2018-09-01 12:30:16');
testing=> INSERT INTO tstest VALUES (2,'2019-09-02 10:30:17');

Create The Index

Now we can use the "::" operator to create an index on the ts field, but as a date rather than a timestamp.

testing=> create index tstest_tstodate on tstest ((ts::date));


Now, will the database use this index? Yes, provided we cast ts as we do in the index.

testing=>SET ENABLE_SEQSCAN=off;
testing=> EXPLAIN SELECT * FROM tstest WHERE ts::date='2019-09-02';
                                 QUERY PLAN                                  
------------------------------------------------------------------------------
 Index Scan using tstest_tstodate on tstest  (cost=0.13..8.14 rows=1 width=12)
   Index Cond: ((ts)::date = '2019-09-02'::date)
(2 rows)

For demonstration it is necessary to disable sequential scanning, ENABLE_SEQSCAN=off, otherwise with a table this small the PostgreSQL will never use any index.

Casting values in an index can be a significant performance win when you frequently query data in a form differing than its recorded form.


by whitemice at September 11, 2019 03:09 PM

August 30, 2019

Whitemice Consulting

Listing Printer/Device Assignments

The assignment of print queues to device URIs can be listed from a CUPS server using the "-v" option.

The following authenticates to the CUPS server as user adam and lists the queue and device URI relationships.

[user@host ~]# lpstat -U adam -h -v | more
device for brtlm1: lpd://
device for brtlp1: socket://
device for brtlp2: socket://
device for brtmfp1: lpd://
device for btcmfp1: lpd://
device for cenlm1: lpd://
device for cenlp: socket://
device for cenmfp1: ipp://
device for ogo_cs_sales_invoices: cups-to-ogo://attachfs/399999909/${guid}.pdf?mode=file&pa.cupsJobId=${id}&pa.cupsJobUser=${user}&pa.cupsJobTitle=${title}
device for pdf: ipp-to-pdf://smtp
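
The queue-to-URI lines are easy to consume from a script. A minimal Python sketch of parsing that output — the sample uses a hypothetical hostname, since the URIs in the listing above are truncated:

```python
def parse_lpstat_v(output: str) -> dict:
    """Map queue names to device URIs from `lpstat -v` output.
    Each relevant line has the form: 'device for NAME: URI'."""
    mapping = {}
    for line in output.splitlines():
        if not line.startswith("device for "):
            continue
        name, _, uri = line[len("device for "):].partition(": ")
        mapping[name] = uri
    return mapping

# Invented sample lines in the same shape as the listing above
sample = """device for cenlp: socket://cenlp.example.com:9100
device for pdf: ipp-to-pdf://smtp"""

print(parse_lpstat_v(sample))
```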

by whitemice at August 30, 2019 07:36 PM

Reprinting Completed Jobs

Listing completed jobs

By default the lpstat command lists the queued/pending jobs on a print queue. However the completed jobs still present on the server can be listed using the "-W completed" option.

For example, to list the completed jobs on the local print server for the queue named "examplep":

[user@host] lpstat -H localhost -W completed examplep
examplep-8821248         ogo             249856   Fri 30 Aug 2019 02:17:14 PM EDT
examplep-8821289         ogo             251904   Fri 30 Aug 2019 02:28:04 PM EDT
examplep-8821290         ogo             253952   Fri 30 Aug 2019 02:28:08 PM EDT
examplep-8821321         ogo             249856   Fri 30 Aug 2019 02:34:48 PM EDT
examplep-8821333         ogo             222208   Fri 30 Aug 2019 02:38:16 PM EDT
examplep-8821337         ogo             249856   Fri 30 Aug 2019 02:38:50 PM EDT
examplep-8821343         ogo             249856   Fri 30 Aug 2019 02:39:31 PM EDT
examplep-8821351         ogo             248832   Fri 30 Aug 2019 02:41:46 PM EDT
examplep-8821465         smagee            1024   Fri 30 Aug 2019 03:06:54 PM EDT
examplep-8821477         smagee          154624   Fri 30 Aug 2019 03:09:38 PM EDT
examplep-8821493         smagee          149504   Fri 30 Aug 2019 03:12:09 PM EDT
examplep-8821505         smagee           27648   Fri 30 Aug 2019 03:12:36 PM EDT
examplep-8821507         ogo             256000   Fri 30 Aug 2019 03:13:26 PM EDT
examplep-8821562         ogo             251904   Fri 30 Aug 2019 03:23:14 PM EDT

Reprinting a completed job

Once the job id is known, the far left column of the the lpstat output, the job can be resubmitted using the lp command.

To reprint the job with the id of "examplep-8821343", simply:

[user@host] lp -i examplep-8821343 -H restart

by whitemice at August 30, 2019 07:29 PM

Create & Deleting CUPs Queues via CLI

Create A Print Queue

[root@host ~]# /usr/sbin/lpadmin -U adam -h -p examplelm1 -E \
  -m "foomatic:HP-LaserJet-laserjet.ppd" -D "Example Pick Ticket Printer"\
   -L "Grand Rapids" -E -v lpd://

This will create a queue named examplelm1 on the host as user adam.

  • "-D" and "-L" specify the printer's description and location, respectively.
  • The "-E" option, which must occur after the "-h" and "-p" options, instructs CUPS to immediately set the new print queue to enabled and accepting jobs.
  • "-v" option specifies the device URI used to communicate with the actual printer.

The printer driver file "foomatic:HP-LaserJet-laserjet.ppd" must be a PPD file available to the print server. PPD files installed on the server can be listed using the "lpinfo -m" command:

[root@crew ~]# lpinfo -m | more
foomatic:Alps-MD-1000-md2k.ppd Alps MD-1000 Foomatic/md2k
foomatic:Alps-MD-1000-ppmtomd.ppd Alps MD-1000 Foomatic/ppmtomd
foomatic:Alps-MD-1300-md1xMono.ppd Alps MD-1300 Foomatic/md1xMono
foomatic:Alps-MD-1300-md2k.ppd Alps MD-1300 Foomatic/md2k
foomatic:Alps-MD-1300-ppmtomd.ppd Alps MD-1300 Foomatic/ppmtomd

The existence of the new printer can be verified by checking its status:

[root@host ~]# lpq -Pexamplelm1
examplelm1 is ready
no entries

The "-l" options of the lpstat command can be used to interrogate the details of the queue:

[root@host ~]# lpstat -l -pexamplelm1
printer examplelm1 is idle.  enabled since Fri 30 Aug 2019 02:56:11 PM EDT
    Form mounted:
    Content types: any
    Printer types: unknown
    Description: Example Pick Ticket Printer
    Alerts: none
    Location: Grand Rapids
    Connection: direct
    Interface: /etc/cups/ppd/examplelm1.ppd
    On fault: no alert
    After fault: continue
    Users allowed:
    Forms allowed:
    Banner required
    Charset sets:
    Default pitch:
    Default page size:
    Default port settings:

Delete A Print Queue

A print queue can also be deleted using the same lpadmin command used to create the queue.

[root@host ~]# /usr/sbin/lpadmin -U adam -h  -x examplelm1
Password for adam on 
lpadmin: The printer or class was not found.
[root@host ~]# lpq -Pexamplelm1
lpq: Unknown destination "examplelm1"!

Note that deleting the print queue appears to fail, but only because the lpadmin command attempts to report the status of the now-deleted queue after the operation.

by whitemice at August 30, 2019 07:11 PM

July 25, 2019

Whitemice Consulting

Changing Domain Password

Uh oh, Active Directory password is going to expire!

Ugh, do I need to log into a Windows workstation to change my password?

Nope, it is as easy as:

awilliam@beast01:~> smbpasswd -U DOMAIN/adam  -r
Old SMB password:
New SMB password:
Retype new SMB password:
Password changed for user adam

In this case DOMAIN is the NetBIOS domain name, and the argument to -r is the domain's DNS domain. One could also specify a domain controller for -r, however in most cases the bare base domain of an Active Directory backed network will resolve to the active collection of domain controllers.

by whitemice at July 25, 2019 03:29 PM

July 08, 2019

As it were ...

What To Expect When Giving Your First WordCamp Talk

I recently convinced a Very Smart Woman to give her first WordCamp talk. What’s a little unique about this circumstance is that it’s also the first WordCamp she’s ever attended. This means she has no frame of reference for what to expect, so she asked me a bunch of questions. I thought it could be useful to grab that perspective and speak to it for posterity. So here is a random collection of things you should know about your first WordCamp talk.

  • There will be a projector you can plug your computer into and it will display on a big screen.
  • You can use any software you want to make your presentation. Keynote, Google slides, PowerPoint, whatever.
  • You don’t have to have a digital component to your talk if you don’t want to.
  • The projector could have a connection type your laptop doesn’t support. A brand new mac supports only USB-C. Many older projectors have only VGA or HDMI. I recommend investing in a converter that fits your laptop.  So if your laptop does DisplayPort and the projector is HDMI, you might want a converter like this.  That said, the majority of the time there’s a converter onsite for you to borrow, whether from the venue or another speaker. Don’t count on it, buy converters when you can afford them, but don’t avoid speaking just because you don’t have a converter.
  • There will be someone within your view that will hold signs up when you’re near the end of your speaking time. You’ll see a 10 minute sign and a 5 minute sign.  Sometimes this person will also introduce you at the beginning of your talk, sometimes not. It can always be your choice though.
  • You should end your talk about 10 minutes before the deadline so there’s time for questions.
  • If you don’t have time for all the questions, announce that you’ll be at the Happiness Bar right after your talk. The Happiness Bar is a place for people to get help and ask questions.  You can hang out there for however long you want answering questions.
  • People are encouraged to walk out of a talk if they discover it’s not suited to them. Don’t take this personally. If your talk really isn’t for them then they need to not waste that time wishing they were in another talk.
  • If someone asks a “question” that’s “more of a comment really”, feel free to interrupt and tell them this is a time for questions, and they could meet you at the happiness bar later if they want. This is YOUR talk, don’t let someone hijack it and make it into what they think it should be. The same holds true of anyone taking control from you. Be strong. YOU are the expert at the front of the room.
  • At some point in your speaking career someone is going to attend your talk that you think is WAY smarter/more knowledgeable/better coder than you or whatever. Don’t worry about it. They’ll still learn something from you, I promise. They attended because they want to hear what you have to say.
  • There’s often a speaker/sponsor dinner/soiree the night before WordCamp. This is usually similar to the after party, but with FAR fewer people. I strongly recommend you attend. They have experience to share, and soon you will too.
  • Speakers usually get a free ticket to WordCamp, so I recommend not buying one until after you find out if you’ve been accepted.

I can’t think of more right now, but I’m sure there are many. Please leave extra tips in the comments below.

The post What To Expect When Giving Your First WordCamp Talk appeared first on As it were....

by topher at July 08, 2019 02:00 PM

June 06, 2019

As it were ...

WordCamp Detroit 2019

On May 18th Cate and I went to WordCamp Detroit. We both spoke. I talked about trends in ecommerce, and Cate talked about Working in WordPress.  It was a small, one-day event, but it was quite fun, and we got to see some unexpected friends.

The post WordCamp Detroit 2019 appeared first on As it were....

by topher at June 06, 2019 09:13 PM

May 24, 2019

Whitemice Consulting

CRON Jobs Fail To Run w/PAM Error

Added a cron job to a service account's crontab using the standard crontab -e -u ogo command. This server has been chugging away for more than a year, with lots of stuff running within the service account - but nothing via cron.

Subsequently the cron jobs didn't run. :( The error logged in /var/log/cron was:

May 24 14:45:01 purple crond[18909]: (ogo) PAM ERROR (Authentication service cannot retrieve authentication info)

The issue turned out to be that the service account - which is a local account, not something from AD, LDAP, etc... - did not have a corresponding entry in /etc/shadow. This breaks CentOS7's default PAM stack (specified in /etc/pam.d/crond). The handy utility pwck will fix this issue, after which the jobs ran without error.

[root@purple ~]# pwck
add user 'ogo' in /etc/shadow? y
pwck: the files have been updated
[root@purple ~]# grep ogo /etc/shadow
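
The consistency check pwck performs here can be sketched in a few lines. A minimal Python illustration of finding passwd accounts with no shadow entry - the condition that broke the PAM stack above (the sample file contents are invented, not real hashes):

```python
def users_missing_shadow(passwd_text: str, shadow_text: str) -> list:
    """Return account names from /etc/passwd content that have no
    corresponding entry in /etc/shadow content."""
    # The account name is the first colon-delimited field in both files
    shadow_users = {line.split(":", 1)[0]
                    for line in shadow_text.splitlines() if line}
    return [line.split(":", 1)[0]
            for line in passwd_text.splitlines()
            if line and line.split(":", 1)[0] not in shadow_users]

# Invented sample contents: 'ogo' exists in passwd but not shadow
passwd = "root:x:0:0:root:/root:/bin/bash\nogo:x:1001:1001::/home/ogo:/bin/bash"
shadow = "root:$6$salt$hash:18000:0:99999:7:::"

print(users_missing_shadow(passwd, shadow))  # ['ogo']
```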

by whitemice at May 24, 2019 08:09 PM

April 24, 2019

As it were ...

April 18, 2019

Whitemice Consulting

MySQL: Reporting Size Of All Tables

This is a query to report the number of rows and the estimated size of all the tables in a MySQL database:

SELECT table_name,
  table_rows,
  ROUND(((data_length + index_length) / 1024 / 1024), 2) AS mb_size
FROM information_schema.tables
WHERE table_schema = 'maindb'
ORDER BY table_rows DESC;

Results look like:

table_name                                  table_rows mb_size 
------------------------------------------- ---------- ------- 
mageplaza_seodashboard_noroute_report_issue 314314     37.56   
catalog_product_entity_int                  283244     28.92   
catalog_product_entity_varchar              259073     29.84   
amconnector_product_log_details             178848     6.02    
catalog_product_entity_decimal              135936     16.02   
shipperhq_quote_package_items               115552     11.03   
amconnector_product_log                     114400     767.00  
amconnector_productinventory_log_details    114264     3.52    

This is a very useful query as the majority of MySQL applications are poorly designed; they tend not to clean up after themselves.

by whitemice at April 18, 2019 06:30 PM

April 16, 2019

As it were ...

I’m A Travelin’ Man

Starting last fall I’ve been traveling quite a bit more than usual. I had every intention of blogging it all like crazy, but blogging is like any other habit. If you don’t do it it’s not a habit.

So I’m going to make this a roundup post and try to do better in the future.

Sydney, Australia

I started my current job last summer on my birthday. Four days later I was in Sydney. It was a bit of a surprise and very exciting. We have an office there, it was the week of WordCamp, and there were two other conferences happening that week. So I went for a week and had a GREAT time. I got to know some of my new co-workers and interact with some from a previous job.

I took lots of pictures, but here are just a couple.

Omaha, Nebraska

In August Cate and I went to WordCamp Omaha.  She’s always wanted to go, and I enjoyed Omaha the other time I’d been there, so away we went. We rented a car and drove, which was a good time all by itself. Cate spoke at the camp but I did not.

Here are a couple pictures.

Pittsburgh, PA

In September Cate and I went to WordCamp Pittsburgh. We’d never been to the city and there were some friends there we wanted to see.  We rented a car again and still enjoyed it. Again, more pics:

Philadelphia, PA

In October I went to WordCamp Philadelphia for work.  It was kind of our kickoff event, but the plugin wasn’t QUITE ready, so we just talked a lot. I really enjoyed traveling with co-workers and showing them what WordCamps are like. I got a little HeroPress love while I was there, some fans were excited to meet.

Austin, TX

Since we have an office in Austin I’ve been there four or five times in the last 10 months or so. I won’t talk a lot about it since I’ve already blogged about it.

Dallas, TX

We have a bunch of friends in Dallas as well, and made a bunch of new ones. I came along with Cate on this one, she spoke again, to great success as always. The most impressive thing about Dallas was the free range beer.

Beer case labeled

Nashville TN, WordCamp US

This one was for work again, but Cate came again of course. Several co-workers from BigCommerce came along as well as a couple representatives from Modern Tribe. We had a great time all around.  Matt stopped by the booth and we chatted, and he even talked about how cool we are on stage at the State Of The Word.

Philly Again

In February the Philadelphia meetup folk asked me to come back and present about BigCommerce. It was a lightning trip, 24 hours on the ground, but hugely successful in my opinion.

Phoenix, AZ

Also in February Cate and I went to WordCamp Phoenix. The weather was great, about 45 degrees and a bit rainy. BigCommerce sponsored and a couple co-workers came with us. I got to speak about HeroPress, Cate was on a panel, and our friend Tracey spoke about ecommerce.

Dayton, OH

In early March Cate and I went to WordCamp Dayton. We had a really good time, but I forgot to take pictures.  🙁

Orlando and Miami, FL

In mid-March I went to Orlando to their WordPress meetup. It went well, and I was able to get some sweet Disney and Potter swag for my family. I was in Orlando for less than 24 hours, and then flew south to Miami for WordCamp. I got there a couple days early and got to sit by the pool in the sun for one whole day. Then it was back to work.  Since I was in town early I was able to help out the organizers with moving camp stuff from a living room into a big truck.

We were sponsors, so we had several people from BigCommerce. I spoke about ecommerce once and HeroPress in a lightning talk.

Washington, D.C.

When I was done in Miami I flew directly to D.C. I got there one day early and had a chance to look around. I was there for their meetup, which went very well.

Austin and London

At the end of March I went to Austin for a few days in the office, and then flew from there directly to London England. I was there for WordCamp, but got there a week early. We have an office there, so I worked with co-workers for a couple days.  I hung out with a friend one evening and walked around town taking pictures. I spoke at WordCamp and met MANY new friends and talked with many old friends.


That’s it for now.  Detroit is next, with possibly Santa Clarita in there. Berlin is in June. I’ll try to do better at posting once per trip.

The post I’m A Travelin’ Man appeared first on As it were....

by topher at April 16, 2019 11:39 PM

April 09, 2019

OpenGroupware (Legacy and Coils)

Create a Workflow Process via REST (curl)

Creation of a process via an HTTP PUT is essentially the same as creation of a route via a WebDAV client as REST is a subset of WebDAV. The input message payload for the process must be PUT as an object named InputMessage in the Route's container. XATTRs (extended attributes) can be set using URL parameters; the ability to set XATTR values is an advantage REST has over most WebDAV clients.

Here is an example of creating a process instance of the workflow route "V200TmpxrefrLoad" with an InputMessage from the local file "Desktop/" and XATTRs named "update", "effective", "taskid", and "batchid".

awilliam@beast01:~> curl -v -u fred -T Desktop/ ''
Enter host password for user 'fred':
* Hostname was NOT found in DNS cache
*   Trying
* Connected to ( port 80 (#0)
* Server auth using Basic with user 'fred'
> PUT /dav/Workflow/Routes/V200TmpxrefrLoad/InputMessage?update=2019-04-03&effective=2019-04-03&taskid=1063257439&batchid=v200-04/19 HTTP/1.1
> Authorization: Basic **************==
> User-Agent: curl/7.37.0
> Host:
> Accept: */*
> Content-Length: 44473641
> Expect: 100-continue
< HTTP/1.1 100 Continue
* We are completely uploaded and fine
< HTTP/1.1 301 Moved
* Server nginx/1.12.2 is not blacklisted
< Server: nginx/1.12.2
< Date: Tue, 09 Apr 2019 18:13:49 GMT
< Content-Type: application/octet-stream
< Content-Length: 0
< Connection: keep-alive
< Set-Cookie: OGOCOILSESSIONID=f9c4efe4-2091-4229-8ac7-68b6fd4a8478-13bb3be8-fae8-472b-9999-514eac324614-3cf33404-c58e-4727-bac1-1754711b9344;; expires=Wed, 10-Apr-2019 18:13:49 UTC; Path=/
< X-COILS-WORKFLOW-OUTPUT-URL: /dav/Workflow/Routes/V200TmpxrefrLoad/1065656529/output
< Location: /dav/Workflow/Routes/V200TmpxrefrLoad/1065656529/input
< X-COILS-WORKFLOW-MESSAGE-UUID: {688a86f2-3898-4d66-8c47-7393fa9fbad6}
* Connection #0 to host left intact

Success is indicated by an HTTP/301 response. The headers in the response provide important meta-data which may be of use to the client.

Header                          Description
X-COILS-WORKFLOW-OUTPUT-URL     The URL to watch for the process' output message.
X-COILS-WORKFLOW-MESSAGE-LABEL  The label assigned to the new message; this will typically be “InputMessage”.
X-COILS-WORKFLOW-PROCESS-ID     The object id of the new process entity.

The priority of the new process can be set to a value other than the default of 201 using the URL parameter ".priority". The value must be a permissible integer priority value. Note that this parameter has a prefix of "." in order to distinguish it from an XATTR value.

In the circumstance where the creation of the process is quashed by run control the response will be HTTP/202. The HTTP/202 response will have a header of X-COILS-WORKFLOW-ALERT with a value of “run-control violation” and the body of the response will describe the event.
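The same PUT is easy to script. Below is a sketch using Python and the requests library (an assumption; any HTTP client works, and the host name, file name, and credentials are placeholders):

```python
from urllib.parse import urlencode

def process_creation_url(host, route, xattrs, priority=None):
    """Build the InputMessage PUT URL for a route, passing XATTRs
    (and the optional ".priority" parameter) as URL parameters."""
    params = dict(xattrs)
    if priority is not None:
        params['.priority'] = priority
    return 'http://{0}/dav/Workflow/Routes/{1}/InputMessage?{2}'.format(
        host, route, urlencode(params))

url = process_creation_url('coils.example.com', 'V200TmpxrefrLoad',
                           {'update': '2019-04-03', 'taskid': '1063257439'})
print(url)

# On a live server:
# import requests
# response = requests.put(url, data=open('input.data', 'rb'),
#                         auth=('fred', 'secret'), allow_redirects=False)
# assert response.status_code == 301  # success
# output_url = response.headers['X-COILS-WORKFLOW-OUTPUT-URL']
```

One advantage of building the URL this way is that urlencode escapes XATTR values containing characters such as "/" (e.g. the "batchid" value in the transcript above).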

by whitemice at April 09, 2019 06:25 PM

April 08, 2019

Whitemice Consulting

Informix: Listing The Locks

The current database locks in an Informix engine are easily enumerated from the sysmaster database.

SELECT
  TRIM(s.username) AS user,
  TRIM(l.dbsname) AS database, 
  TRIM(l.tabname) AS table,
  TRIM(l.type) AS type,
  s.sid AS session,
  l.rowidlk AS rowid
FROM sysmaster:syslocks l
  INNER JOIN sysmaster:syssessions s ON (s.sid = l.owner)
WHERE l.dbsname NOT IN('sysmaster')

The results are pretty straightforward:

User      Database  Table            Type  Session  Row ID
extranet  maindb    site_master      IS    436320   0
shuber    maindb    workorder        IS    436353   0
shuber    maindb    workorder        IX    436353   0
shuber    maindb    workorder_visit  IS    436353   0
extranet  maindb    customer_master  IS    436364   0
jkelley   maindb    workorder        IX    436379   0
jkelley   maindb    workorder        IS    436379   0
mwathen   maindb    workorder        IS    436458   0

by whitemice at April 08, 2019 08:10 PM

September 26, 2018

As it were ...

get_options Topher Rap

My friends Kyle and Adam run a podcast together called get_options(). I am ashamed to admit I haven’t listened to any episodes (except one, you’ll see), but in my defense I don’t listen to any podcasts. I’ve been ON a few podcasts, but I didn’t even listen to those episodes.

Anyway, I was talking with Kyle recently and he said “Did you hear the rap I made for you?”.  I had not. I’ve seen Kyle and Adam rap together before, so I knew it could be done, but it never occurred to me that I would be the subject of one of these raps. Yet apparently I was.  It was in honor of me getting a new job. In episode 60 Kyle breaks out the rap.  Here’s the link to the episode, and here’s the Soundcloud clip of just the rap:

The post get_options Topher Rap appeared first on As it were....

by topher at September 26, 2018 12:08 AM

September 14, 2018

As it were ...

Austin, Texas

This summer I’ve had the pleasure of visiting Austin twice. The first trip started on my birthday in July and lasted 4 days. The second trip was in August and lasted 2 weeks. Both times were for work, and both times I stayed at “extended stay” hotels, which means they had full kitchens, and much more robust laundry utilities available.  I don’t remember the name of the first place I stayed, but the second place was called Home2 and is NICE. The laundry facilities were top notch (see images below), the staff were very nice, breakfast was decent every morning. The pool was quite cool.  It’s outside, and saline instead of chlorine. My only regret is that there was no hot tub.

I mostly didn’t go Out. I was at work all day, and mostly worked or studied in my apartment all evening.  One evening I did go out with my boss Travis and 3 guys all named Nate to a place called Perry’s Steakhouse. I had absolutely without question the best steak I’ve ever had in my life. It was absolutely incredible.

I’m sure I’ll get back there, my office is there and some friends are there. I’ll try to do more pictures then.

Here are some pictures from the trip.


The post Austin, Texas appeared first on As it were....

by topher at September 14, 2018 12:52 AM

September 08, 2018

Whitemice Consulting

Reading BYTE Fields From An Informix Unload

Exporting records from an Informix table is simple using the UNLOAD TO command. This creates a delimited text file with a row for each record, the fields of the record separated by the specified delimiter. Useful for data archiving, the files can easily be restored or processed with a Python script.

One complexity exists: if the record contains a BYTE (BLOB) field the contents are dumped hex encoded. This is not base64. To read these files take the hex encoded string value and decode it with Python 2's faux "hex" codec: content.decode("hex") (under Python 3, use bytes.fromhex() or binascii.unhexlify()).

The following script reads an Informix unload file delimited with pipes ("|") decoding the third field which was of the BYTE type.

rfile = open(ARCHIVE_FILE, 'r')
counter = 0
row = rfile.readline()
while row:
    counter += 1
    print(
        'row#{0} @ offset {1}, len={2}'
        .format(counter, rfile.tell(), len(row), )
    )
    blob_id, content, mimetype, filename, tmp_, tmp_ = row.split('|')
    content = content.decode("hex")  # Python 2 "hex" codec
    print('  BLOBid#{0} "{1}" ({2}), len={3}'.format(
        blob_id, filename, mimetype, len(content),
    ))
    if mimetype == 'application/pdf':
        if '/' in filename:
            filename = filename.replace('/', '_')
        wfile = open('wds/{0}.{1}.pdf'.format(blob_id, filename, ), 'wb')
        wfile.write(content)
        wfile.close()
    row = rfile.readline()
rfile.close()
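The script above is Python 2 (the "hex" codec on str). Under Python 3 the equivalent decode is binascii.unhexlify (or bytes.fromhex); a minimal sketch:

```python
import binascii

def decode_byte_field(value):
    """Decode a hex-encoded BYTE field from an Informix unload row."""
    return binascii.unhexlify(value)

# "255044462d" is the hex encoding of the PDF magic "%PDF-"
print(decode_byte_field('255044462d'))  # → b'%PDF-'
```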

by whitemice at September 08, 2018 08:05 PM

July 03, 2018

As it were ...

A Dream Job?

You may recall that in January of 2017 I started a grand experiment with Tanner Moushey. As experiments go, it was a great success, which is to say we learned a lot. As businesses go, it lasted until Feb of 2018. It was a great experience, and I learned a lot, and it paid the bills for a year, but as any entrepreneur will tell you, it’s a stressful life.

So after February I started looking for a Real Job. I applied to a number of places that didn’t even respond (one of which had approached me first!). I did two trials at Automattic and washed out of both of them. That was a great learning experience as well.

Spring faded into summer, and I was doing contract work to keep bread on the table, but that was getting old.

Then one night at 10pm a couple weeks ago my friend Luke sent me a Slack note, saying he knew of a large company looking for a WordPress evangelist, would I be interested? If you know anything about me then you know I was immediately interested.

He told me a little about it on the spot, but he was in a meeting with them in Sydney at the time (hence 10pm my time). I was a little wary at first. This sounded REALLY good, and I’d already been disappointed by other things this summer.

The next morning I sent an email to The Guy at The Company and we arranged to talk when he got back to Austin.  He basically went from plane ride from Sydney to a meeting with me to jury duty, all in one day. Iron man.

We talked for about 30 min and they said they were sending me an offer as quickly as possible.  Five days later I had an offer and accepted it!

So now I’m the WordPress Developer Evangelist for BigCommerce.  “But wait!” you say. “They don’t do WordPress do they?”.  For the unaware, BigCommerce is a hosted ecommerce solution. You sign up, pay the fee, and *poof* you have a store. Well, recently they decided to get into WordPress, big time. You can read about it here and here.

I’m crazy excited of course. I’ve been looking for a WordPress evangelist job for years, but beyond that I’m also really excited about the product. I know who built it, and I know who’s code reviewing it. I’ve been assured by people I trust that they’re putting the appropriate time and money into this project, and it should be really really solid. The number of good WordPress ecommerce plugins is really low, and some serious competition will only be a good thing I think.

So maybe I’ll be seeing you at a WordCamp soon! Feel free to ask me all the questions.

The post A Dream Job? appeared first on As it were....

by topher at July 03, 2018 02:56 PM

May 29, 2018

Whitemice Consulting

Disabling Transparent Huge Pages in CentOS7

The THP (Transparent Huge Pages) feature of modern Linux kernels is a boon for on-metal servers with a sufficiently advanced MMU. However, THP can also result in performance degradation and inefficient memory use when enabled in a virtual machine [depending on the hypervisor and hosting provider]. See, for example, "Use of large pages can cause memory to be fully allocated". If you are seeing issues in a virtualized environment that point towards unexplained memory consumption it may be worthwhile to experiment with disabling THP in your guests. These are instructions for controlling the THP feature through the use of a systemd unit.

Create the file /etc/systemd/system/disable-thp.service:

[Unit]
Description=Disable Transparent Huge Pages (THP)

[Service]
Type=oneshot
ExecStart=/bin/sh -c "echo 'never' > /sys/kernel/mm/transparent_hugepage/enabled && echo 'never' > /sys/kernel/mm/transparent_hugepage/defrag"

[Install]
WantedBy=multi-user.target

Enable the new unit:

sudo systemctl daemon-reload
sudo systemctl start disable-thp
sudo systemctl enable disable-thp

THP will now be disabled. However, already-allocated huge pages are still active; rebooting the server is advised to bring the services up with THP disabled.
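To verify the change, the THP sysfs files list every mode with the active one in brackets (e.g. "always madvise [never]"); a small illustrative parser for that convention:

```python
def thp_mode(text):
    """Return the bracketed (active) mode from a THP sysfs file."""
    return text[text.index('[') + 1:text.index(']')]

print(thp_mode('always madvise [never]'))  # → never

# On a live system:
# with open('/sys/kernel/mm/transparent_hugepage/enabled') as f:
#     print(thp_mode(
```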

by whitemice at May 29, 2018 07:30 PM

May 06, 2018

Whitemice Consulting

Informix Dialect With CASE Derived Polymorphism

I ran into an interesting issue when using SQLAlchemy 0.7.7 with the Informix dialect. In a rather ugly database (which dates back to the late 1980s) there is a table called "xrefr" that contains two types of records: "supersede" and "cross". What those signify doesn't really matter for this issue so I'll skip any further explanation. But the really twisted part is that while a single field distinguishes between these two record types - it does not do so based on a consistent value. If the value of this field is "S" then the record is a "supersede"; any other value (including NULL) means it is a "cross". This makes creating a polymorphic presentation of this schema a bit more complicated. But have no fear, SQLAlchemy is here!

When faced with a similar issue in the past, on top of PostgreSQL, I've created polymorphic presentations using CASE clauses. But when I tried to do this using the Informix dialect the generated queries failed. They raised the dreaded -201 "Syntax error or access violation" message.

The Informix SQLCODE -201 is in the running for "Most useless error message ever!". Currently it is tied with PHP's "Stack Frame 0" message. Microsoft's "File not found" [no filename specified] is no longer in the running as she is being held at the Hague to face war crimes charges.

Rant: Why do developers get away with such lazy error messages?

The original [failing] code that I tried looked something like this:

    class XrefrRecord(Base):
        __tablename__  = 'xrefr'
        record_id      = Column("xr_serial_no", Integer, primary_key=True)
        _supersede     = Column("xr_supersede", String(1))
        is_supersede   = column_property( case( [ ( _supersede == 'S', 1, ), ],
                                                else_ = 0 ) )

        __mapper_args__ = { 'polymorphic_on': is_supersede }   

    class Cross(XrefrRecord): 
        __mapper_args__ = {'polymorphic_identity': 0} 

    class Supersede(XrefrRecord): 
        __mapper_args__ = {'polymorphic_identity': 1}

The generated query looked like:

      SELECT xrefr.xr_serial_no AS xrefr_xr_serial_no,
             CASE
               WHEN (xrefr.xr_supersede = :1) THEN :2 ELSE :3
             END AS anon_1
      FROM xrefr
      WHERE xrefr.xr_oem_code = :4 AND
            xrefr.xr_vend_code = :5 AND
            CASE
              WHEN (xrefr.xr_supersede = :6) THEN :7
              ELSE :8
            END IN (:9) <--- ('S', 1, 0, '35X', 'A78', 'S', 1, 0, 0)

At a glance it would seem that this should work. If you substitute the values for their place holders in an application like DbVisualizer - it works.

The condition raising the -201 error is the use of place holders in a CASE WHEN structure within the projection clause of the query statement; the DBAPI module / Informix Engine does not [or can not] infer the type [cast] of the values. The SQL cannot be executed unless the values are bound to a type. Why this results in a -201 and not a more specific data-type related error... that is beyond my pay-grade.

An existential dilemma: Notice that when used like this in the projection clause the values to be bound are both input and output values.

The trick to get this to work is to explicitly declare the types of the values when constructing the case statement for the polymorphic mapper. This can be accomplished using the literal_column expression.

    from sqlalchemy import literal_column

    class XrefrRecord(Base):
        _supersede    = Column("xr_supersede", String(1))
        is_supersede  = column_property( case( [ ( _supersede == 'S', literal_column('1', Integer) ) ],
                                                   else_ = literal_column('0', Integer) ) )

        __mapper_args__     = { 'polymorphic_on': is_supersede }

Visually if you log or echo the statements they will not appear to be any different than before; but SQLAlchemy is now binding the values to a type when handing the query off to the DBAPI informixdb module.

Happy polymorphing!

by whitemice at May 06, 2018 08:23 PM

Sequestering E-Mail

When testing applications one of the concerns is always that their actions don't affect the real world. One aspect of this is sending e-mail; the last thing you want is the application you are testing to send a paid-in-full customer a flurry of e-mails that he owes you a zillion dollars. A simple, and reliable, method to avoid this is to adjust the Postfix server on the host used for testing to bury all mail in a shared folder. This way:

  • You don't need to make any changes to the application between production and testing.
  • You can see the message content exactly as it would ordinarily have been delivered.

To accomplish this you can use Postfix's generic address rewriting feature; generic address rewriting processes addresses of messages sent [vs. received as is the more typical case for address rewriting] by the service. For this example we'll rewrite every address to a single sequestration address using a regular expression.


Create the regular expression map. Maps are how Postfix handles all rewriting; a match for the input address is looked for in the left hand [key] column and rewritten in the form specified by the right hand [value] column.

echo "/(.)/ " &gt; /etc/postfix/generic.regexp


Configure Postfix to use the new map for generic address rewriting.

postconf -e smtp_generic_maps=regexp:/etc/postfix/generic.regexp


Tell Postfix to reload its configuration.

postfix reload

Now any mail, to any address, sent via the hosts' Postfix service, will be driven not to the original address but to the shared "myfolder" folder.
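The effect of the map can be illustrated in Python. This is purely illustrative (Postfix performs its own matching), and "sequester@example.com" stands in for whatever sequestration address the map's right-hand column specifies:

```python
import re

# The map key /(.)/ matches any address containing at least one
# character, so every recipient is rewritten to the same target.
def rewrite(address, target='sequester@example.com'):
    if re.search(r'(.)', address):
        return target
    return address

print(rewrite('customer@bigcorp.example'))  # → sequester@example.com
```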

by whitemice at May 06, 2018 08:11 PM

April 22, 2018

Whitemice Consulting

LDAP extensibleMatch

One of the beauties of LDAP is how simply it lets the user or application perform searching. The various attribute types hint at how to intelligently perform searches, such as case sensitivity with strings, or whether dashes should be treated as relevant characters in the case of phone numbers. However, there are circumstances when you need to override this intelligence and make your search more or less strict - for example, the case sensitivity of a string. That is the purpose of the extensibleMatch.

Look at this bit of schema:

attributetype ( NAME 'name'
EQUALITY caseIgnoreMatch
SUBSTR caseIgnoreSubstringsMatch
SYNTAX{32768} )
attributetype ( NAME ( 'sn' 'surname' )
DESC 'RFC2256: last (family) name(s) for which the entity is known by'
SUP name )

The caseIgnoreMatch means that searches on attribute "name", or its descendant "sn" (used in the objectclass inetOrgPerson), are performed in a case insensitive manner. So...

estate1:~ # ldapsearch -Y DIGEST-MD5 -U awilliam sn=williams dn
SASL/DIGEST-MD5 authentication started
Please enter your password:
SASL username: awilliam
SASL installing layers
# Adam Williams, People, Entities, SAM,
dn: cn=Adam Williams,ou=People,ou=Entities,ou=SAM,dc=whitemice,dc=org
# Michelle Williams, People, Entities, SAM,
dn: cn=Michelle Williams,ou=People,ou=Entities,ou=SAM,dc=whitemice,dc=org

... this search returns two objects where the sn value is "Williams" even though the search string was "williams".

If for some reason we want to match just the string "Williams", and not the string "williams", we can use the extensibleMatch syntax.

estate1:~ # ldapsearch -Y DIGEST-MD5 -U awilliam "(sn:caseExactMatch:=williams)" dn
SASL/DIGEST-MD5 authentication started
Please enter your password:
SASL username: awilliam
search: 3
result: 0 Success
estate1:~ #

No objects are found, as both entries actually store "Williams" with an initial capital letter while the filter demanded an exact match on lowercase "williams".

Using extensibleMatch I was able to match the value of "sn" with my own preference regarding case sensitivity. The syntax for an extensibleMatch is "({attributename}:{matchingrule}:={filterspec})". This can be used inside a normal LDAP filter along with 'normal' matching expressions.
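Such filters are easy to assemble programmatically; a trivial helper in Python (the function name is illustrative, and no LDAP library is needed just to build the string):

```python
def extensible_match(attribute, matching_rule, value):
    """Build an extensibleMatch filter: ({attributename}:{matchingrule}:={filterspec})."""
    return '({0}:{1}:={2})'.format(attribute, matching_rule, value)

print(extensible_match('sn', 'caseExactMatch', 'williams'))
# → (sn:caseExactMatch:=williams)
```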

For more information on extensibleMatch see RFC2252 and your DSA's documentation [FYI: Active Directory is a DSA (Directory Service Agent), as is OpenLDAP, or

by whitemice at April 22, 2018 03:14 PM

Android, SD cards, and exfat

I needed to prepare some SD cards for deployment to Android phones. After formatting the first SD card in a phone I moved it to my laptop and was met with the "Error mounting... unknown filesystem type exfat" error. That was somewhat startling as GVFS gracefully handles almost anything you throw at it. Following this I dropped down to the CLI to inspect how the SD card was formatted.

awilliam@beast01:~> sudo fdisk -l /dev/mmcblk0
Disk /dev/mmcblk0: 62.5 GiB, 67109912576 bytes, 131074048 sectors
Units: sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disklabel type: dos
Disk identifier: 0x00000000

Device         Boot Start       End   Sectors  Size Id Type
/dev/mmcblk0p1 *     2048 131074047 131072000 62.5G  7 HPFS/NTFS/exFAT

Seeing the file-system type I guessed that I was missing support for the hack that is exFAT [exFAT is FAT tweaked for use on large SD cards]. A zypper search exfat found two uninstalled packages; GVFS is principally an encapsulation of fuse that adds GNOME awesome into the experience - so the existence of a package named "fuse-exfat" looked promising.

I installed the two related packages:

awilliam@beast01:~> sudo zypper in exfat-utils fuse-exfat
(1/2) Installing: exfat-utils-1.2.7-5.2.x86_64 ........................[done]
(2/2) Installing: fuse-exfat-1.2.7-6.2.x86_64 ........................[done]
Additional rpm output:
Added 'exfat' to the file /etc/filesystems
Added 'exfat_fuse' to the file /etc/filesystems

I removed the SD card from my laptop, reinserted it, and it mounted. No restart of anything required. GVFS rules! At this point I could move forward with rsync'ing the gigabytes of documents onto the SD card.

It is also possible to initially format the card on the openSUSE laptop. Partition the card creating a partition of type "7" and then use mkfs.exfat to format the partition. Be careful to give each card a unique volume name using the -n option.

awilliam@beast01:~> sudo mkfs.exfat  -n 430E-2980 /dev/mmcblk0p1
mkexfatfs 1.2.7
Creating... done.
Flushing... done.
File system created successfully.

The mkfs.exfat command is provided by the exfat-utils package; a filesystem-utils package exists for most (all?) supported file-systems. These -utils packages provide the various commands to create, check, repair, or tune the eponymous file-system type.

by whitemice at April 22, 2018 02:34 PM

April 03, 2018

Whitemice Consulting


After downloading a VirtualBox-ready ISO of OpenVAS, the newly created virtual machine to host the instance failed to start with a VERR_PDM_DEVHLPR3_VERSION_MISMATCH error. The quick-and-dirty solution was to set the instance to use USB 1.1. This setting is changed under Machine -> Settings -> USB -> Select USB 1.1 OHCI Controller. After that change the instance now boots and runs the installer.

openSUSE 42.3 (x86_64)

by whitemice at April 03, 2018 12:21 PM

March 11, 2018

Whitemice Consulting

AWESOME: from-to Change Log viewer for PostgreSQL

Upgrading a database is always a tedious process - a responsible administrator will have to read through the Changelog of every subsequent version, from the version ze is upgrading from to the one ze is upgrading to.

Then I found this! This is a Changelog viewer which allows you to select a from and a to version and shows you all the changelogs in between; on one page. You still have to read it, of course, but this is a great time saver.

by whitemice at March 11, 2018 01:15 AM

February 09, 2018

As it were ...

Get Array neighbors in PHP

I recently had an issue where I had a custom post type of Artist, and another of Artwork. When looking at a single piece of Artwork, I used posts2posts to get the related Artist, and then I also did a query to get an array of all of the other Artwork by that Artist. I used that array to render them as thumbnails below the main Artwork.

The related Artwork array really isn’t sorted in any way. It’s a standard post array, with incremental keys.

I needed to put links on the page to Previous and Next Artworks, like this:

Screenshot showing next and prev links.

Initially I used WordPress’ built in functions for previous and next post, but that relied on the chronology of all Artworks, irrespective of Artist, so they immediately left the current Artist and went to something unrelated.

To get the array I wanted, I took my standard posts array and did this:

// get a list of all of the IDs of that other art
$art_list = wp_list_pluck( $connected_art, 'post_title', 'ID' );

which got me a very concise array of array keys matching my post IDs. The post_title is a red herring, I don’t use it.

I needed to take my Art array and get the ID of the post on either side of the current Artwork. I looked at prev() and next() but messing with the array pointer doesn’t work in a for loop, so it was a pain.

I found some code in the comments for the next() function that came close to what I wanted, but left some things to be desired. So I used it as a base and ended up with the function below.

/**
 * Function to get array keys on either side of a given key. If the
 * initial key is first in the array then prev is null. If the initial
 * key is last in the array, then next is null.
 * If wrap is true and the initial key is last, then next is the first
 * element in the array.
 * If wrap is true and the initial key is first, then prev is the last
 * element in the array.
 * @param array $arr
 * @param string $key
 * @param bool $wrap
 * @return array $return
 */
function array_neighbor( $arr, $key, $wrap = false ) {

	krsort( $arr );
	$keys       = array_keys( $arr );
	$keyIndexes = array_flip( $keys );

	$return = array();
	if ( isset( $keys[ $keyIndexes[ $key ] - 1 ] ) ) {
		$return['prev'] = $keys[ $keyIndexes[ $key ] - 1 ];
	} else {
		$return['prev'] = null;
	}

	if ( isset( $keys[ $keyIndexes[ $key ] + 1 ] ) ) {
		$return['next'] = $keys[ $keyIndexes[ $key ] + 1 ];
	} else {
		$return['next'] = null;
	}

	if ( false != $wrap && empty( $return['prev'] ) ) {
		$end            = end( $arr );
		$return['prev'] = key( $arr );
	}

	if ( false != $wrap && empty( $return['next'] ) ) {
		$beginning      = reset( $arr );
		$return['next'] = key( $arr );
	}

	return $return;
}

Then you get your data with something like this, where $current_art is just the current post ID.

// grab the IDs of the art on either side of this one
$art_neighbors = array_neighbor( $art_list, $current_art, true );

The output looks like this:

Array
(
    [prev] => 2257
    [next] => 2253
)

Those are post IDs, so I was able to simply drop those into get_permalink() for my next/prev links.

The post Get Array neighbors in PHP appeared first on As it were....

by topher at February 09, 2018 02:57 PM

January 17, 2018

Whitemice Consulting

Discovering Informix Version Via SQL

It is possible using the dbinfo function to retrieve the engine's version information via an SQL command:

select dbinfo('version','full') from sysmaster:sysdual

which will return a value like:

IBM Informix Dynamic Server Version 12.10.FC6WE

by whitemice at January 17, 2018 08:56 PM

December 28, 2017

OpenGroupware (Legacy and Coils)

ODBC Support Added To OIE

As of OpenGroupware Coils 0.1.49r112 support for ODBC data sources has been integrated into OIE. These SQL data sources are defined in the OIESQLSources directive just as PostgreSQL and Informix database connections are. This feature requires the pyodbc module to be installed. The availability of this module on your workflow node can be verified using the coils-dependency-check tool.

[ ~]# coils-dependency-check 
OK: Module markdown (Markdown rendering, required for /wiki protocol) available.
OK: Module vobject (vCard and vEvent parsing) available.
OK: Module zope.interface (ZOPE Interfaces for RML engine) available.
OK: Module xlrd (XLS<2007 read support) available.
OK: Module pycups (IPP printing support) available.
OK: Module paramiko (SSH suppport.) available.
OK: Module dateutil (Date & Time Arithmatic) available.
OK: Module lxml (SAX & DOM XML Processing) available.
OK: Module Pillow (Python Imaging Library) available.
OK: Module psycopg2 (PostgreSQL RDBMS connectivity) available.
OK: Module base64 (Encode and decode Base64 data) available.
OK: Module yaml (YAML parser & serializer) available.
OK: Module pyodbc (ODBC SQL connectivity) available.   <<<<<<<<<<
OK: Module xlwt (XLS<2007 write support) available.
OK: Module sqlalchemy (Object Relational Modeling) available.
OK: Module pytz (Python Time Zone tables) available.
OK: Module smbc (SMB/CIFS integration) available.
OK: Module argparse (Enhanced argument parsing, required for /wiki protocol) available.
OK: Module ijson (Streaming JSON parser, requires libyajl) available.
OK: Module z3c.rml (RML Generator, also requires "zope.interface") available.
OK: Module (Streaming XML Creation) available.
OK: Module (Simple PDF Operations) available.
OK: Module untangle (XML parsing) available.
OK: Module gnupg (GPG/PGP suppport.) available.
OK: Module informixdb (Informix RDBMS connectivity) available.

The principal use for the ODBC connection is to connect to M$-SQL database engines. In order to make ODBC connections the proper ODBC driver must be installed on the node and properly configured.

ODBC database connections are defined in the OIESQLSources configuration directive just as with PostgreSQL and Informix database connections. The driver must be "odbc" and the "DSN" parameter must be the complete ODBC connection string.

coils-server-config --directive=OIESQLSources  --value='{
  "acumaticaMVP1": {"driver": "odbc",
                    "DSN": "DSN=AcumaticaDB;UID=oie-workflow-account;PWD=XXXXXXXXXXXXXXXXXXXXXXXXXX"}
}'
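A source definition is plain JSON; presumably the workflow executor parses it and hands the DSN string to pyodbc.connect. A rough sketch (the connection itself is commented out, and the value shown is hypothetical):

```python
import json

# A hypothetical OIESQLSources value, shaped like the directive above.
value = '''{"acumaticaMVP1": {"driver": "odbc",
                              "DSN": "DSN=AcumaticaDB;UID=oie-workflow;PWD=secret"}}'''
source = json.loads(value)['acumaticaMVP1']
print(source['driver'])  # → odbc

# On a node with the matching ODBC driver installed and configured:
# import pyodbc
# connection = pyodbc.connect(source['DSN'])
```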

A defined connection can be tested using the coils-test-sql utility. This test is best performed from the node hosting the coils.workflow.executor component.

[ ~]# coils-test-sql --name=acumaticaMVP1 --table=ARInvoice
Store root is /var/lib/
Connected to SQL "acumaticaMVP1"
  Select Table: "ARInvoice"

If the connection works then it is ready to be used from workflow actions like sqlSelectAction and sqlExecuteAction.

by whitemice at December 28, 2017 02:43 PM

October 09, 2017

Whitemice Consulting

Failure to apply LDAP pages results control.

On a particular instance of OpenGroupware Coils the switch from an OpenLDAP server to an Active Directory service - which should be nearly seamless - resulted in "Failure to apply LDAP pages results control.". Interesting, as Active Directory certainly supports paged results - the 1.2.840.113556.1.4.319 control.

But there is a caveat! Of course.

Active Directory does not support the combination of the paged control and referrals in some situations. So to reliably get the paged control enabled it is also necessary to disable referrals.

dsa = ldap.initialize(config.get('url'))
dsa.set_option(ldap.OPT_PROTOCOL_VERSION, 3)
dsa.set_option(ldap.OPT_REFERRALS, 0)

Disabling referrals is likely what you want anyway, unless you are going to implement referral following. Additionally, in the case of Active Directory the referrals rarely reference data which an application would be interested in.

The details of Active Directory and paged results + referrals can be found here

by whitemice at October 09, 2017 03:03 PM

August 31, 2017

Whitemice Consulting

opensuse 42.3

Finally got around to updating my work-a-day laptop to openSUSE 42.3. As usual I did an in-place distribution update via zypper. This involves replacing the previous version repositories with the current version repositories - and then performing a dup. And as usual the process was quick and flawless. After a reboot everything just-works and I go back to doing useful things. This makes for an uninteresting BLOG post, which is as it should be.

zypper lr --url
zypper rr packman
zypper rr repo-non-oss
zypper rr repo-oss
zypper rr repo-update-non-oss
zypper rr repo-update-oss
zypper rr server:mail
zypper ar repo-non-oss
zypper ar repo-oss
zypper ar server:mail
zypper ar repo-update-non-oss
zypper ar repo-update-oss
zypper ar packman
zypper lr --url  # double check
zypper ref  # refresh
zypper dup --download-in-advance  # distribution update
zypper up  # update, just a double check


by whitemice at August 31, 2017 12:49 PM

August 07, 2017

OpenGroupware (Legacy and Coils)

An Introduction to OIE Tables

The OIE Table entity provides a simple means to embed look-ups, filters, and translations into workflows. The principle of a Table is that it always receives a value and returns a value - a look-up.

Table definitions are presented via WebDAV in the /dav/Workflow/Tables folder as simple YAML files; they can be created and edited using your favorite text editor. If you are familiar with OIE Format definitions the Table definition should seem very familiar. Tables are identified by their unique name which is specified by the name attribute of their YAML description.

StaticLookupTable
The static look-up table provides a method to do simple recoding of data without relying on external data-sources such as an LDAP DSA or SQL RDBMS. The definition of a StaticLookupTable provides a values dictionary where input values are looked up and the corresponding value returned. The optional defaultValue directive may specify a value to be returned if the input value is not found in the values table; if no defaultValue is specified the table will return a None.

class: StaticLookupTable
defaultValue: 9
values: { 'ME1932': 4,
          'Kalamazoo': 'abc' }
name: TestStaticLookupTable

Text 1: A StaticLookupTable that returns 4 for the input value "ME1932", and "abc" for the input value "Kalamazoo". Any other input value results in the value 9.
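The behavior described in Text 1 amounts to a dictionary lookup with a default; a minimal Python sketch of the semantics (an illustration only, not OpenGroupware Coils' actual implementation):

```python
# Emulate the StaticLookupTable from Text 1: a dict lookup with a default.
VALUES = {'ME1932': 4, 'Kalamazoo': 'abc'}
DEFAULT_VALUE = 9

def lookup_value(value):
    """Return the mapped value, or the defaultValue when not found."""
    return VALUES.get(value, DEFAULT_VALUE)

print(lookup_value('ME1932'))     # 4
print(lookup_value('Kalamazoo'))  # abc
print(lookup_value('anything'))   # 9
```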

PresenceLookupTable
A presence look-up table contains a list of static values. It returns a specified value if the input value matches one of the values stored in the table; otherwise it returns an alternative value. Presence look-up tables are most commonly used when a small and known set of values needs to be used to filter a set of data.

class: PresenceLookupTable
name: BankeCodeExclusionTable
returnValueForFalse: true
returnValueForTrue: false
values: [ME1932, Kalamazoo, 123]

Text 2: A PresenceLookupTable that returns boolean false for the input values "ME1932", "Kalamazoo", and 123; returns boolean true for all other input values.

SQLLookupTable
An SQLLookupTable permits the translation or look-up of values using an SQL data source defined in the OIESQLSources server default. The table definition must at the minimum define SQLQueryText and SQLDataSourceName directives. Within the SQLQueryText value the "?" is substituted for the input value; the first column of the query result is the return value of the table. If the query identifies no rows then a None value is returned from the table.

SQLDataSourceName: mydbconnection
SQLQueryText: 'SELECT CASE WHEN ... THEN
    ''True'' ELSE ''False'' END  FROM bank_code_exclusion WHERE bank_code
    = ? AND ex_service_followup = ''Y'';'
class: SQLLookupTable
doInputStrip: true
doInputUpper: true
doOutputStrip: false
doOutputUpper: false
name: ServiceFollowUpExclusionTable
useSessionCache: true

Text 3: An example SQLLookupTable which uses the data-source "mydbconnection" as defined in the OIESQLSources server default.

The optional directives doInputStrip, doInputUpper, doOutputStrip, and doOutputUpper, which all default to false, allow the input and output values to be converted to upper case and stripped of white-space. Converting a value to upper case may be useful where the database backend does not support case-insensitive comparison. Trimming whitespace on input values protects against looking up padded strings, and output trimming is useful for database engines that always return string values defined like CHAR(30) as padded values.
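The mechanics - input scrubbing, "?" substitution, first-column return value - can be sketched against an in-memory SQLite database. The table and data here are invented for the demonstration and stand in for whatever engine the OIESQLSources entry really points at:

```python
import sqlite3

# Stand-in data source; OIE would resolve this from OIESQLSources.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE bank_code_exclusion (bank_code TEXT, ex_service_followup TEXT)')
conn.execute("INSERT INTO bank_code_exclusion VALUES ('ME1932', 'Y')")

SQL_QUERY_TEXT = (
    "SELECT CASE WHEN COUNT(*) > 0 THEN 'True' ELSE 'False' END "
    "FROM bank_code_exclusion WHERE bank_code = ? AND ex_service_followup = 'Y'"
)

def lookup_value(value, do_input_strip=True, do_input_upper=True):
    """Emulate an SQLLookupTable look-up: scrub the input, substitute it
    for the "?", and return the first column of the first row (or None)."""
    if do_input_strip:
        value = value.strip()
    if do_input_upper:
        value = value.upper()
    row = conn.execute(SQL_QUERY_TEXT, (value, )).fetchone()
    return row[0] if row else None

print(lookup_value(' me1932 '))  # True  (stripped and upper-cased first)
print(lookup_value('XX0000'))    # False
```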

Using Tables

In Python code using a table is as simple as loading the class and calling the lookup_value method. How the Table performs the look-up is entirely encapsulated in the appropriate Table class [SQLLookupTable, StaticLookupTable, ...].

table = Table.Load(name)
return table.lookup_value(value)

Text 4: How to use a table to look-up values in Python code.

More commonly Table lookups are going to be performed within workflow actions such as maps and transforms. When performing an XSLT transform any table is available via the tablelookup OIE extension method; this allows values from the input stream to be easily used as look-up values, facilitating translation of ERP and other codes/abbreviations between disparate applications.

<xsl:template match="row">
  <xsl:if test="total_charges > 1000">
    <xsl:variable name="include" select="oie:tablelookup('ServiceFollowUpExclusionTable',string(bank_code))"/>
    <xsl:if test="$include='True'">
      <!-- ... emit the transformed row here ... -->
    </xsl:if>
  </xsl:if>
</xsl:template>

Text 5: This snippet of an XSLT transform demonstrates how to use a Table lookup from within a stylesheet.

Overall, Tables provide a simple and elegant way to automate the code look-ups and translations needed across the wide variety of documents processed by the workflow engine, as well as a means to easily implement dynamic filtering.

Author: Adam Tauno Williams

by whitemice at August 07, 2017 10:25 AM

Invoking an OIE Route from PHP

The repository now contains a PHP class making it simple to invoke an OIE workflow from PHP; see the oie.php file. Using the OIEProcess class defined in that file, processes can be created and the process id and input message UUID retrieved.

$HTTPROOT   = "";
$ROUTENAME  = "TEST_MailBack";
$PARAMETERS = array('myParameter'=>'YOYO MAMA', 'otherParam'=>4);
$request = new OIEProcess($HTTPROOT, $ROUTENAME, $PARAMETERS);
if ($request->start('adam', '*******', fopen('/etc/passwd', 'r'), 'text/plain') == 'OK') {
    echo "\n";
    echo "Process ID: " . $request->get_process_id() . "\n";
    echo "Message UUID: " . $request->get_message_id() . "\n";
}

The start method returns either "OK", "OIEERR" (OIE refused the request), or "SYSERR" (the curl operation failed). The first and second parameters of start are the user credentials; the optional third and fourth parameters are the input message stream and the payload MIME-type. If no MIME-type is specified a default of "application/octet-stream" is assumed.

Author: Adam Tauno Williams

by whitemice at August 07, 2017 10:10 AM

The SMTP Listener

Similar to the coils.workflow.9100 service that can deliver raw socket connections into defined workflows OpenGroupware Coils also provides an SMTP listener. The listener enables workflows to receive messages via SMTP; simply configure your MTA (Mail Transfer Agent) to route some prefix such as "ogo" to your OpenGroupware Coils instance and then use plussed address syntax to deliver e-mail messages to specific objects.

Workflows can be invoked using ogo+wf+routeName@ syntax; for example, to send an e-mail message to the workflow named ExampleStatusUpdate a message would be addressed to ogo+wf+ExampleStatusUpdate at your server's domain. In the case of a workflow the text/plain body of the message will become the input message for a new instance (Process) of that route. A ticket is open to implement support for receiving specific MIME-types attached to a message as the process' input message.

To target an entity with a message, assuming your delivery is routing ogo@ to the OpenGroupware Coils listener, send a message to ogo+objectId at your server's domain, where objectId is the numeric object id of the entity. In most cases the entity must allow read access to the NetworkContext (OGo#8999) via an ACE in the object's ACL. OpenGroupware Coils network service components interact with the object model via the NetworkContext; this security context has minimal access to the server's objects and content for obvious security reasons. Additional access must be deliberately granted to allow unauthenticated services such as the socket and SMTP listener to interact with an object.

Document folders are one entity that supports receiving messages from the SMTP listener. In order to access the folder the NetworkContext must have read access to the folder entity and in order to actually store content in the folder it must have write access. For this reason it is recommended that a specific folder be created in a project for the purpose of receiving SMTP messages; from that folder a user, application, or workflow can relocate and possibly rename the documents.

For example, a message sent to ogo+1234567, where OGo#1234567 is a document folder to which the NetworkContext has read/write permissions, will be stored in its raw form in that folder. Most document-oriented applications however cannot easily deal with raw e-mail messages [after all, they aren't e-mail clients]. Perhaps what you really need is some document that is attached to these e-mail messages? This is a common use case with document centers - they scan documents into PDF and deliver them via SMTP. In order to facilitate this use-case and to streamline document management the user or application can define the MIME type of the documents the folder should receive. If a MIME type is defined for SMTP collection on a folder then only that type of document, attached to a received message, will be saved - the attachments will be automatically saved from the message and the message itself discarded.

In order to define a MIME-type for SMTP collection on a folder create an object property in the {} namespace having an attribute name of collectMIMEType. The value of that property should be the MIME-type you desire to collect. For example, if {}collectMIMEType was defined on OGo#1234567 [from our previous example] having a value of "application/pdf" then only PDF attachments would be saved to the folder. There are two special-case MIME-types:

  • message/rfc822 - This is the default type, and just as if the object property were not defined, will cause incoming messages to be saved in their entirety.
  • text/plain - This value will save the text/plain message body as a document in the folder.
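The attachment-collection behavior can be sketched with Python's standard email package. This is an illustration of the filtering concept only, not the listener's actual code, and the addresses and message below are made up:

```python
import email
import email.policy

# A minimal message with a text body and a (fake) PDF attachment.
raw = (
    b"From: sender@example.com\r\n"
    b"To: ogo+1234567@example.com\r\n"
    b"Message-ID: <test-1@example.com>\r\n"
    b"Subject: scanned document\r\n"
    b"MIME-Version: 1.0\r\n"
    b"Content-Type: multipart/mixed; boundary=XYZ\r\n\r\n"
    b"--XYZ\r\n"
    b"Content-Type: text/plain\r\n\r\n"
    b"See attached.\r\n"
    b"--XYZ\r\n"
    b"Content-Type: application/pdf\r\n"
    b"Content-Disposition: attachment; filename=scan.pdf\r\n\r\n"
    b"%PDF-1.4 ...\r\n"
    b"--XYZ--\r\n"
)

COLLECT_MIME_TYPE = 'application/pdf'  # the folder's collectMIMEType property

# Keep only the parts whose MIME-type matches the collection type;
# the message wrapper itself is discarded.
msg = email.message_from_bytes(raw, policy=email.policy.default)
saved = [
    (part.get_filename(), part.get_content())
    for part in msg.walk()
    if part.get_content_type() == COLLECT_MIME_TYPE
]
print([name for name, _ in saved])  # ['scan.pdf']
```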

On every document created by the SMTP listener a set of object properties will be created. These properties correspond to headers in the e-mail message from which the document was created; if a corresponding header does not exist in the e-mail message then the corresponding object property will not be created. The SMTP listener defines a set of interesting headers; if you believe there are headers that should be captured but are not included in this list feel free to request the addition of the header via the project's ticket application on SourceForge.

The currently defined list of object properties created from message headers are:

  • {us.opengroupware.mail.header}subject
  • {us.opengroupware.mail.header}x-spam-level
  • {us.opengroupware.mail.header}from
  • {us.opengroupware.mail.header}to
  • {us.opengroupware.mail.header}date
  • {us.opengroupware.mail.header}x-spam-status
  • {us.opengroupware.mail.header}reply-to
  • {us.opengroupware.mail.header}x-virus-scanned
  • {us.opengroupware.mail.header}x-bugzilla-classification
  • {us.opengroupware.mail.header}x-bugzilla-product
  • {us.opengroupware.mail.header}x-bugzilla-component
  • {us.opengroupware.mail.header}x-bugzilla-severity
  • {us.opengroupware.mail.header}x-bugzilla-status
  • {us.opengroupware.mail.header}x-bugzilla-url
  • {us.opengroupware.mail.header}x-mailer
  • {us.opengroupware.mail.header}x-original-sender
  • {us.opengroupware.mail.header}mailing-list
  • {us.opengroupware.mail.header}list-id
  • {us.opengroupware.mail.header}x-opengroupware-regarding
  • {us.opengroupware.mail.header}x-opengroupware-objectid
  • {us.opengroupware.mail.header}x-original-to
  • {us.opengroupware.mail.header}in-reply-to
  • {us.opengroupware.mail.header}cc
  • {us.opengroupware.mail.header}x-gm-message-state
  • {us.opengroupware.mail.header}message-id

All documents created will have at least the property {us.opengroupware.mail.header}message-id as Message-ID is a required header [per RFC822]. The SMTP component will not process a message that lacks a Message-ID header. The Message-ID and a timestamp are used to create the document's filename.

In addition to these properties the property {}contentType used by the WebDAV presentation will also be set on created documents to store the original MIME-type.

These properties can be used to correlate or qualify the documents, and [of course] can be used as search qualifications when using zOGI's searchForObjects.

Document creation by SMTP provides a very simple integration path with innumerable devices, both consumer and enterprise grade. From there your applications can easily access the documents by zOGI (JSON-RPC or XML-RPC), AttachFS (REST), or WebDAV.

Author: Adam Tauno Williams

by whitemice at August 07, 2017 10:06 AM

June 06, 2017

Whitemice Consulting

LDAP Search For Object By SID

All the interesting objects in an Active Directory DSA have an objectSID which is used throughout the Windows subsystems as the reference for the object. When using a Samba4 (or later) domain controller it is possible to simply query for an object by its SID, as one would expect - like "(&(objectSID=S-1-...))". However, when using a Microsoft DC searching for an object by its SID is not as straight-forward; attempting to do so will only result in an invalid search filter error. Active Directory stores the objectSID as a binary value and one needs to search for it as such. Fortunately converting the text string SID value to a hex string is easy: see the guid2hex(text_sid) below.

import ldap
import ldap.sasl
import ldaphelper

PDC_LDAP_URI = 'ldap://'
OBJECT_SID = 'S-1-5-21-2037442776-3290224752-88127236-1874'
LDAP_ROOT_DN = 'DC=example,DC=com'

def guid2hex(text_sid):
    """convert the text string SID to a hex encoded string"""
    s = ['\\{:02X}'.format(ord(x)) for x in text_sid]
    return ''.join(s)

def get_ldap_results(result):
    return ldaphelper.get_search_results(result)

if __name__ == '__main__':

    pdc = ldap.initialize(PDC_LDAP_URI)
    pdc.sasl_interactive_bind_s("", ldap.sasl.gssapi())
    result = pdc.search_s(
        LDAP_ROOT_DN,
        ldap.SCOPE_SUBTREE,
        '(&(objectSID={0}))'.format(guid2hex(OBJECT_SID), ),
        [ '*', ],
    )
    for obj in [x for x in get_ldap_results(result) if x.get_dn()]:
        """filter out objects lacking a DN - they are LDAP referrals"""
        print('DN: {0}'.format(obj.get_dn(), ))


by whitemice at June 06, 2017 12:11 AM

April 08, 2017

As it were ...

Why I no longer hate GoDaddy

There was a time when I said “never GoDaddy”. I turned down contracts when the client wanted to be hosted on GoDaddy, and wouldn’t budge. Over the last few years my attitude has changed pretty dramatically. I’m happy to work with GoDaddy now, and I like what they’re doing as a company.

Recently a friend tweeted this:

That is absolutely a fair question, and I think one that deserves a better answer than a tweet back, so this post is intended to be that answer.

Why I Didn’t Like GoDaddy

My primary reason was their choice to use sex as a marketing tool. Every commercial made me cringe. I felt so sad that NASCAR’s first serious female contender was cast as someone sexy rather than someone with amazing accomplishments. There was so much opportunity there to inspire young women and girls with the idea that they can break cultural norms.

A secondary reason was the lifestyle of the owner. He simply made choices I don’t like. Lots of people do, and that’s fine, but I made the choice not to use his product.

There were also some tech issues I didn’t like. For a long time you couldn’t get shell access, for example. That annoyed me like crazy.

Lastly, they were the biggest player. I always root for the underdog.

What Changed

The real change came when key people inside GoDaddy decided the company was doing harmful things, and decided to do something about it. The owner sold the company and took a smaller and smaller role in controlling the company until he was simply gone.

At that point the opportunity existed to take a higher road, and they did it. The sex came out of the commercials. There are now more women than men in positions of authority inside the company.

In general things have really turned around.

What Doesn’t Matter

I recently heard someone bad mouth GoDaddy, and then someone else jump in and say “How can you hate GoDaddy?  Mendel Kurland is such a cool guy!” For the unaware, Mendel works there. And he is a cool guy, I like him a lot. I have other friends that work there too.

None of that matters. My beef wasn’t with individual people there, but corporate direction.

So Everything’s Perfect?

No. There are still things I don’t like about GoDaddy. But those things are in the same class as things I don’t like about every host as well. They’re not using protocol X, or they meddle too much in the site creation, or whatever. They’re not anything that I would feel like I need to apologize to my daughter for.

In Summary

In the past I’ve been vocal about “never GoDaddy”. I’m not that way anymore.

The post Why I no longer hate GoDaddy appeared first on As it were....

by topher at April 08, 2017 10:13 PM

March 07, 2017

Whitemice Consulting

KDC reply did not match expectations while getting initial credentials

Occasionally one gets reminded of something old.

[root@NAS04256 ~]# kinit
Password for adam@Example.Com: 
kinit: KDC reply did not match expectations while getting initial credentials


[root@NAS04256 ~]# kinit adam@EXAMPLE.COM
Password for adam@EXAMPLE.COM:
[root@NAS04256 ~]# 

In some cases the case of the realm name matters: Kerberos realm names are case-sensitive, and "Example.Com" is not the same realm as "EXAMPLE.COM".

by whitemice at March 07, 2017 02:18 PM

February 09, 2017

Whitemice Consulting

The BOM Squad

So you have a lovely LDIF file of Active Directory schema that you want to import using the ldbmodify tool provided with Samba4... but when you attempt the import it fails with the error:

Error: First line of ldif must be a dn not 'dn'
Modified 0 records with 0 failures

Eh? @&^$*&;@&^@! It does start with a dn: attribute; it is an LDIF file!

Once you cool down you look at the file using od, just in case, and you see:

0000000   o   ;   ?   d   n   :  sp   c   n   =   H   o   r   d   e   -

The first line does not actually begin with "dn:" - it starts with the "o;?". You've been bitten by the BOM! But even opening the file in vi you cannot see the BOM because every tool knows about the BOM and deals with it - with the exception of anything LDIF related.

The fix is to break out dusty old sed and remove the BOM -

sed -e '1s/^\xef\xbb\xbf//' horde-person.ldf  > nobom.ldf

And double checking it with od again:

0000000   d   n   :  sp   c   n   =   H   o   r   d   e   -   A   g   o

The file now actually starts with a "dn" attribute!
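For the sed-averse, the same repair can be done in a few lines of Python with the codecs module (the sample bytes mirror the od output above):

```python
import codecs

def strip_bom(data):
    """Remove a leading UTF-8 byte-order mark, if present."""
    if data.startswith(codecs.BOM_UTF8):
        return data[len(codecs.BOM_UTF8):]
    return data

# Example: the BOM-prefixed first line from the broken LDIF.
broken = codecs.BOM_UTF8 + b'dn: cn=Horde-...'
print(strip_bom(broken)[:3])  # b'dn:'
```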

by whitemice at February 09, 2017 12:09 PM

Installation & Initialization of PostGIS

Distribution: CentOS 6.x / RHEL 6.x

If you already have a current version of PostgreSQL server installed on your server from the PGDG repository you should skip these first two steps.

Enable PGDG repository

curl -O
rpm -ivh pgdg-centos93-9.3-1.noarch.rpm

Disable all PostgreSQL packages from the distribution repositories. This involves editing the /etc/yum.repos.d/CentOS-Base.repo file. Add the line "exclude=postgresql*" to both the "[base]" and "[updates]" stanzas. If you skip this step everything will appear to work - but in the future a yum update may break your system.

Install PostgreSQL Server

yum install postgresql93-server

Once installed you need to initialize and start the PostgreSQL instance

service postgresql-9.3 initdb
service postgresql-9.3 start

If you wish the PostgreSQL instance to start with the system at boot use chkconfig to enable it for the current runlevel.

chkconfig postgresql-9.3 on

The default data directory for this instance of PostgreSQL will be "/var/lib/pgsql/9.3/data". Note that this path is versioned - this prevents the installation of a downlevel or uplevel PostgreSQL package destroying your database if you do so accidentally or forget to follow the appropriate version migration procedures. Most documentation will assume a data directory like "/var/lib/postgresql" [notably unversioned]; simply keep in mind that you always need to contextualize the paths used in documentation to your site's packaging and provisioning.

Enable EPEL Repository

The EPEL repository provides a variety of the dependencies of the PostGIS packages provided by the PGDG repository.

curl -O
rpm -Uvh epel-release-6-8.noarch.rpm

Installing PostGIS

The PGDG package for PostGIS should now install without errors.

yum install postgis2_93

If you do not have EPEL successfully enabled when you attempt to install the PGDG PostGIS packages you will see dependency errors.

---> Package postgis2_93-client.x86_64 0:2.1.1-1.rhel6 will be installed
--> Processing Dependency: for package: postgis2_93-client-2.1.1-1.rhel6.x86_64
--> Finished Dependency Resolution
Error: Package: gdal-libs-1.9.2-4.el6.x86_64 (pgdg93)
Error: Package: gdal-libs-1.9.2-4.el6.x86_64 (pgdg93)
Error: Package: gdal-libs-1.9.2-4.el6.x86_64 (pgdg93)

Initializing PostGIS

The template database "template_postgis" is expected to exist by many PostGIS applications, but this database is not created automatically.

su - postgres
createdb -E UTF8 -T template0 template_postgis
-- ... See the following note about enabling plpgsql ...
psql template_postgis
psql -d template_postgis -f /usr/pgsql-9.3/share/contrib/postgis-2.1/postgis.sql
psql -d template_postgis -f /usr/pgsql-9.3/share/contrib/postgis-2.1/spatial_ref_sys.sql 

Using the PGDG packages the PostgreSQL plpgsql embedded language, frequently used to develop stored procedures, is enabled in the template0 database from which the template_postgis database is derived. If you are attempting to use other PostgreSQL packages, or have built PostgreSQL from source [are you crazy?], you will need to ensure that this language is enabled in your template_postgis database before importing the schema - to do so run the following command immediately after the "createdb" command. If you see the error stating the language is already enabled you are good to go; otherwise you should see a message stating the language was enabled. If creating the language fails for any reason other than already being enabled you must resolve that issue before proceeding to install your GIS applications.

$ createlang -d template_postgis plpgsql
createlang: language "plpgsql" is already installed in database "template_postgis"


PostGIS is now enabled in your PostgreSQL instance and you can use and/or develop exciting new GIS & geographic applications.

by whitemice at February 09, 2017 11:43 AM

February 03, 2017

Whitemice Consulting

Unknown Protocol Drops

I've seen this one a few times and it is always momentarily confusing: on an interface on a Cisco router there is a rather high number of "unknown protocol drops". What protocol could that be?! Is it some type of hack attempt? Ambitious, if they are shaping their own raw packets onto the wire. But, no, the explanation is the much less exciting, and typical, lazy ape kind of error.

  5 minute input rate 2,586,000 bits/sec, 652 packets/sec
  5 minute output rate 2,079,000 bits/sec, 691 packets/sec
     366,895,050 packets input, 3,977,644,910 bytes
     Received 15,91,926 broadcasts (11,358 IP multicasts)
     0 runts, 0 giants, 0 throttles
     0 input errors, 0 CRC, 0 frame, 0 overrun, 0 ignored
     0 watchdog
     0 input packets with dribble condition detected
     401,139,438 packets output, 2,385,281,473 bytes, 0 underruns
     0 output errors, 0 collisions, 3 interface resets
     97,481 unknown protocol drops  <<<<<<<<<<<<<<
     0 babbles, 0 late collision, 0 deferred

This is probably the result of CDP (Cisco Discovery Protocol) being enabled on one interface on the network and disabled on this interface. CDP is the unknown protocol. CDP is a proprietary Data Link layer protocol that, if enabled, sends an announcement out the interface every 60 seconds. If the receiving end gets the CDP packet and has "no cdp enable" in the interface configuration, those announcements count as "unknown protocol drops". The solution is to make the CDP settings, enabled or disabled, consistent on every device in the interface's scope.

by whitemice at February 03, 2017 06:32 PM

Screen Capture & Recording in GNOME3

GNOME3, aka GNOME Shell, provides a comprehensive set of hot-keys for capturing images from your screen as well as recording your desktop session. These tools are priceless for producing documentation and reporting bugs; recording your interaction with an application is much easier than describing it.

  • Alt + Print Screen : Capture the current window to a file
  • Ctrl + Alt + Print Screen : Capture the current window to the cut/paste buffer
  • Shift + Print Screen : Capture a selected region of the screen to a file
  • Ctrl + Shift + Print Screen : Capture a selected region of the screen to the cut/paste buffer
  • Print Screen : Capture the entire screen to a file
  • Ctrl + Print Screen : Capture the entire screen to the cut/paste buffer
  • Ctrl + Alt + Shift + R : Toggle screencast recording on and off.

Recorded video is in WebM format (VP8 codec, 25fps). Videos are saved to the ~/Videos folder and image files are saved in PNG format into the ~/Pictures folder. When screencast recording is enabled there will be a red recording indicator in the bottom right of the screen; this indicator will disappear once screencasting is toggled off again.

by whitemice at February 03, 2017 06:29 PM

Converting a QEMU Image to a VirtualBox VDI

I use VirtualBox for hosting virtual machines on my laptop and received a Windows 2008R2 server image from a consultant as a compressed QEMU image. So how to convert the QEMU image to a VirtualBox VDI image?

Step#1: Convert QEMU image to raw image.

Starting with the file WindowsServer1-compressed.img (size: 5,172,887,552)

Convert the QEMU image to a raw/dd image using the qemu-img utility.

qemu-img convert  WindowsServer1-compressed.img  -O raw  WindowsServer1.raw

I now have the file WindowsServer1.raw (size: 21,474,836,480)

Step#2: Convert the RAW image into a VDI image using the VBoxManage tool.

VBoxManage convertfromraw WindowsServer1.raw --format vdi  WindowsServer1.vdi
Converting from raw image file="WindowsServer1.raw" to file="WindowsServer1.vdi"...
Creating dynamic image with size 21474836480 bytes (20480MB)...

This takes a few minutes, but finally I have the file WindowsServer1.vdi (size: 14,591,983,616)

Step#3: Compact the image

Smaller images are better! It is likely the image is already compact; however this also doubles as an integrity check.

VBoxManage modifyhd WindowsServer1.vdi --compact

Sure enough the file is the same size as when we started (size: 14,591,983,616). The upside is the compact operation went through the entire image without any errors.

Step#4: Cleanup and make a working copy.

Now MAKE A COPY of that converted file and use the copy for testing. Set the original as immutable [chattr +i] to prevent it being used by accident. I do not want to waste time converting the original image again.

Throw away the intermediate raw image and compress the image we started with for archive purposes.

rm WindowsServer1.raw 
cp WindowsServer1.vdi WindowsServer1.SCRATCH.vdi 
sudo chattr +i WindowsServer1.vdi
bzip2 -9 WindowsServer1-compressed.img 

The files at the end:

File                               Size
WindowsServer1-compressed.img.bz2  5,102,043,940
WindowsServer1.SCRATCH.vdi         14,591,983,616
WindowsServer1.vdi                 14,591,983,616


Generate a new UUID for the scratch image. This is necessary anytime a disk image is duplicated. Otherwise you risk errors like "Cannot register the hard disk '/archive/WindowsServer1.SCRATCH.vdi' {6ac7b91f-51b6-4e61-aa25-8815703fb4d7} because a hard disk '/archive/WindowsServer1.vdi' with UUID {6ac7b91f-51b6-4e61-aa25-8815703fb4d7} already exists" as you move images around.

VBoxManage internalcommands sethduuid WindowsServer1.SCRATCH.vdi
UUID changed to: ab9aa5e0-45e9-43eb-b235-218b6341aca9

Generating a unique UUID guarantees that VirtualBox is aware that these are distinct disk images.
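Conceptually, sethduuid just stamps a fresh random (version 4) UUID into the image header; Python's uuid module generates the same class of identifier:

```python
import uuid

# VBoxManage's sethduuid writes a fresh random UUID; uuid4() produces
# the same kind of identifier (random, version 4).
new_uuid = uuid.uuid4()
print(new_uuid)          # a random UUID, e.g. ab9aa5e0-45e9-43eb-b235-218b6341aca9
print(new_uuid.version)  # 4
```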

Versions: VirtualBox 5.1.12, QEMU Tools 2.6.2. On openSUSE LEAP 42.2 the qemu-img utility is provided by the qemu-img package.

by whitemice at February 03, 2017 02:36 PM