Looking for magic in all the right places

Having three kids, all six and under, I get pelted with all kinds of kids’ shows. Today, my youngest was having his milk while sitting in his seat, watching Sofia the First. I was working on the sofa, but somehow digested the whole episode.

What caught my fancy was how my fantasy-writing nature digested all of the magical rules this episode put forth. The often bumbling royal sorcerer (Mr. Cedric) was forced to save the magic school from some pranksters, but was stuck in a spell that bound him and his hands to a chair. He couldn’t wave his wand and hence could cast no spell.

Observing this hard magical rule of their universe made me start mentally flipping through other magical systems, trying to spot their rules and decide whether or not I liked them. It feels like a way to look objectively at my own magic systems and see if there are any big gaps to address.

The first thing that came to mind was Harry Potter. I read the first book and have seen all the movies. (I promise I’ll circle back and read the other books at some point.) Harry Potter appears to be a wand-only environment as well. One of the first spells you see (in the movies, anyway) is Expelliarmus, the one that knocks your opponent’s wand out of their hand. More advanced casters actually manage to catch the wand.

I know what you’re thinking. What about potions and scrolls? Okay, those also exist. But it appears that in these systems, you can NOT do a spell all by your lonesome. You need a medium to help you.

Which brings me to my third example: Doctor Strange. As a big-time collector from long ago, I am quite familiar with that system of magic. And it is QUITE different. Instead of being wand-only, it has different levels of magic. Sorcerers tap magical energy from themselves, from the surrounding universe, and, for the most powerful ones, from other-dimensional entities. I have really enjoyed the conflicts and challenges this system posed. In one issue, all the “good” entities cash in, demanding Doctor Strange come serve them in exchange for having let him use their energy.

If you are going to write fantasy and have magic, you must put time and thought into it. If it’s a cornerstone of your novel (and it often is), five minutes of thought won’t cut it. But don’t view it as laborious and torturous. The more time you invest, the more it will take root and bear fruit. I find much excitement when members of my critique group pelt me with questions.

Demo app built for #LearningSpringBoot

I’ve spent the last few days building the app I will use in the new Learning Spring Boot video. I was able to move quickly thanks to the power of Spring Boot, Spring Data, and some other features I’ll dive into in more detail in the video itself.

It’s really fun to sit down and BUILD an application with Spring Boot. The Spring Framework has always made it fun to develop apps, but Spring Boot takes that fun and excitement to a whole new level. The hardest part was saying, “this is all I need; this covers the case,” and stopping. I have it mostly in place; just a few bits are left.

Having been inspired a couple years ago by this hilarious clip, I sought the most visually stunning and audacious demo I could think of: snapping a picture and uploading for others to see.

I created one based on Spring Data REST for various conferences and eventually turned it into a scalable, microservice-based solution. (See my videos in the sidebar for examples of that!) That was really cool and showed how far you could get with a domain model definition. But that path isn’t the best route for this video, so I am starting over.

Once again, I am taking the same concept but rebuilding it from scratch with the same focus I used in writing the original Learning Spring Boot book: take the most often used bits of Spring (Spring MVC, Spring Data, and Spring Security) and show how Spring Boot accelerates the developer experience while prepping you for real-world ops.

I have chatted with many people, both inside my own company and outside it. They have all given me a consistent message: the book helped them with some of the most common situations they face on a daily basis. That’s what I was shooting for! Hopefully, this time around I can trek along the same critical path but cover ground that was missed last time, and also prune out things that turned out to be mistakes. (For example, this time around I’m not spending as much time on JavaScript.)

Now it’s time to knuckle down, lay out the scripts, and start recording video. That part will be new ground. I have recorded screencasts and webinars, but not on this scale. I hope to build the best video I can for all my readers (and soon-to-be viewers).

Wish me luck!


Awesome pinball invitational @RavenTools last night! #pinball

Last night was phenomenal! Listening to the inaugural episode of the NashDev podcast, I heard word of a pinball invitational being hosted in downtown Nashville. Trekking there last night with my father-in-law, we didn’t leave until long after its official end time had passed.

Chatting with the locals was also pretty cool, and reminded me how much I missed playing pinball. Now that my kids are getting bigger, I might find more time to put my machines into action. My five-year-old will now power on Cirqus Voltaire, pull a chair over, and start playing.

They tipped me off about other collectors in my own town, so I have started poking around to rekindle my pinball presence.

I traded stories with several of the other collectors. I had forgotten how fun it is to talk shop about my machines and hear how others had found theirs. It also reminded me that in this day and age, when everything is on the Internet, pinball is one of those things you still get to put your hands on.

I signed my name on the way out, adding myself to a list of people interested in doing more pinball stuff in the future. All in all, it was an awesome time.


#LearningSpringBoot Video is on its way!

I just signed a contract to produce a video-based sequel to Learning Spring Boot, a veritable 2nd edition!

This will move a lot faster than its predecessor. Schedule says primary recording should be done sometime in August, so hopefully you can have it in your hands FAST! (I know Packt has quick schedules, but this is even faster).

This won’t be some rehash of the old book. Instead, we’ll cover a lot of fresh ground including:

  • Diving into Spring’s start.spring.io website.
  • Building a rich, fully functional web app using Spring Boot, Spring Data JPA, Spring Security, and more.
  • Using Boot’s latest and greatest tools, like Actuator, DevTools, and CRaSH.

I’ve written three books. Recording video will be a new adventure. If you want to stay tuned for updates, be sure to sign up for my newsletter.

Can’t wait! Hope you’re as excited as me.

REST, SOAP, and CORBA, i.e. how we got here

I keep running into ideas, thoughts, and decisions swirling around REST. So many things keep popping up that make me want to scream, “Just read the history, and you’ll understand it!!!”

So I thought I might pull out the good ole Wayback Machine, travel to an early part of my career, and discuss a little bit about how we all got here.

In the good ole days of CORBA

This is an ironic expression, given computer science can easily trace its roots back to WWII and Alan Turing, way before I was born. But let’s step back to somewhere around 1999-2000, when CORBA was all the rage. This is even more ironic, because the CORBA spec goes back to 1991. Let’s just say this is where I come in.

First of all, do you even know what CORBA is? It is the Common Object Request Broker Architecture. To simplify, it was an RPC protocol based on the Proxy pattern. You define a language-neutral interface, and CORBA tools compile client and server code into your language of choice.

The gold in all this? Clients and servers could be completely different languages. C++ clients talking to Java servers. Ada clients talking to Python servers. Everything from the interface definition language to the wire protocol was covered. You get the idea.

Up until this point, clients and servers spoke binary protocols bound up in the language. Let us also not forget that open source wasn’t as prevalent as it is today. Hessian RPC 1.0 came out in 2004. If you’re thinking of Java RMI, too bad: CORBA preceded RMI. Two systems talking to each other were plagued by a lack of open mechanisms and tech agreements. C++ just didn’t talk to Java.

CORBA is a cooking!

With the rise of CORBA, things started cooking. I loved it! In fact, I was once known as Captain Corba at my old job, due to being really up to snuff on its ins and outs. In a rare fit of nerd nirvana, I purchased Steve Vinoski’s book Advanced CORBA Programming with C++ and had it autographed by the man himself when he came onsite for a talk.

Having written a mixture of Ada and C++ at the beginning of my career, it was super cool watching another team build a separate subsystem on a different stack. Some parts were legacy Ada code, wrapped with an Ada-Java-CORBA bridge. Fresh systems were built in Java. All systems spoke smoothly.

The cost of CORBA

This was nevertheless RPC. Talking to each other required meeting and agreeing on interfaces. Updates to interfaces required updates on both sides. The process to make updates was costly, since it involved multiple people meeting in a room and hammering out these changes.

The high specificity of these interfaces also made them brittle. Rolling out a new version required ALL clients to upgrade at once. It was an all-or-nothing proposition.

At the time, I was involved with perhaps half a dozen teams, and the actual user base was quite small. So the cost wasn’t huge compared to today’s web-scale problems.

Anybody need a little SOAP?

After moving off that project, I worked on another system that required integrating remote systems. I rubbed my hands together, ready to put my polished CORBA talents to good use again, but our chief software engineer duly informed me of a new technology being evaluated: SOAP.

“Huh?”

The thought of chucking all this CORBA talent did not excite me. A couple of factors transpired FAST that allowed SOAP to break onto the scene.

First of all, this was Microsoft’s response to the widely popular CORBA standard. Fight standards with standards, ehh? In that day and age, Microsoft fought valiantly to own any stack, end-to-end (and they aren’t today???? Wow!). It was built up around XML (another new acronym to me). At the time of its emergence, you could argue it was functionally equivalent to CORBA. Define your interface, generate client-side and server-side code, and it’s off to the races, right?

But another issue was brewing in CORBA land. The spec from the OMG, the consortium responsible for CORBA, had gaps. Kind of like trying to write queries in ONLY ANSI SQL: simply not good enough. To cover these gaps, every vendor had proprietary extensions. The biggest one was Iona, an Irish company that at one time held 80% of the CORBA market share. We knew them as “I-own-ya'” given their steep price.

CORBA was supposed to be cross-vendor supported, but it wasn’t. You bought all your middleware from the same vendor. Something clicked, and LOTS of customers dropped Iona. This galvanized the rise of SOAP.

But there was a problem

SOAP took off and CORBA stumbled. To this day, we have enterprise customers avidly using Spring Web Services, our SOAP integration library. I haven’t seen a CORBA client in years. That doesn’t mean CORBA is dead, but SOAP moved into the strong position.

Yet SOAP still had the same fundamental issue: fixed, brittle interfaces that required agreement between all parties. Slight changes required upgrading everyone.

When you build interfaces designed for machines, you usually need a high degree of specification. Precise types, fields, all that. Change one tiny piece of that contract, and clients and servers are no longer talking. Things were highly brittle. But people had to chug along, so they started working around the specs any way they could.

I worked with a CORBA-based, off-the-shelf ticketing system. It had four versions of its CORBA API to talk to. A clear problem when using pure RPC (CORBA or SOAP).

Cue the rise of the web

While “rise of the web” sounds like some fancy Terminator sequel, the rampant increase in the web being the platform of choice for e-commerce, email, and so many other things caught the attention of many, including Roy Fielding.

Roy Fielding was a computer scientist who had been involved in more than a dozen RFC specs that governed how the web operated, the biggest arguably being the HTTP spec. He understood how the web worked.

The web had responded to what I like to call brute economy. If literally millions of e-commerce sites were based on the paradigm of brittle RPC interfaces, the web would never have succeeded. Instead, the web was built up on lots of tiny standards: exchanging information and requests via HTTP, formatting data with media types, a strict set of operations known as the HTTP verbs, hypermedia links, and more.

But there was something else in the web that was quite different: flexibility. By constraining the actual HTML elements and operations that were available, browsers and web servers became points of communication that didn’t require coordination when a website was updated. Moving HTML forms around on a page didn’t break consumers. Changing the text of a button didn’t break anything. If the backend moved, it was fine as long as the link in the page’s HTML button was updated.

The REST of the story

In his doctoral dissertation, published in 2000, Roy Fielding attempted to take the lessons learned from building a resilient web and apply them to APIs. He dubbed this Representational State Transfer, or REST.

So far, things like CORBA, SOAP, and other RPC protocols had been based on the faulty premise of defining, with high precision, the bits of data sent over the wire and back. Things that are highly precise are the easiest to break.

REST is based on the idea that you should send not just data but also information on how to consume it. And by adopting some basic constraints, clients and servers can work out a lot of details through a more symbiotic set of machine + user interactions.

For example, sending a record for an order is valuable, but it’s even handier to also send related links: a link to the customer that ordered it, links to the catalog entry for each item, and a link to the delivery tracking system.

Clients don’t have to use all of this extra data, but by providing enough self discovery, clients can adapt without suffering brittle updates.
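To make that concrete, here is a minimal sketch of a client navigating by link relations instead of assembling URIs. Everything here is hypothetical: the resource shapes are loosely modeled on HAL-style responses, and a plain dictionary stands in for a real HTTP service.

```python
# A hypothetical in-memory "service": each resource carries its data plus a
# _links section naming related resources. URIs and fields are made up.
RESOURCES = {
    "/orders/523": {
        "status": "SHIPPED",
        "_links": {
            "self": {"href": "/orders/523"},
            "customer": {"href": "/customers/42"},
            "tracking": {"href": "/tracking/998"},
        },
    },
    "/customers/42": {
        "name": "Frodo Baggins",
        "_links": {"self": {"href": "/customers/42"}},
    },
}

def get(uri):
    """Stand-in for an HTTP GET against the service."""
    return RESOURCES[uri]

def follow(resource, rel):
    """Follow a named link relation; the client never pastes a URI together."""
    return get(resource["_links"][rel]["href"])

order = get("/orders/523")
customer = follow(order, "customer")  # navigate by rel, not by hand-built URI
```

The point of the sketch: if the server later moves customers to a different URI scheme, only the `href` values change, and this client keeps working untouched.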

The format of data can be dictated by media types, something that made it easy for browsers to handle HTML, image files, PDFs, etc. Browsers were coded once, long ago, to render a PDF document inline including a button to optionally save. Done and done. HTML pages are run through a different parser. Image files are similarly rendered without needing more and more upgrades to the browser. With a rich suite of standardized media types, web sites can evolve rapidly without requiring an update to the browser.

Did I mention machine + user interaction? Instead of requiring the client to consume links, it can instead display the links to the end user and let him or her actually click on them. We call this well-known technique hypermedia.

To version or not to version, that is the question!

A question I get anytime I discuss Spring Data REST or Spring HATEOAS is how to version APIs. To quote Roy Fielding: don’t do it! People don’t version websites. Instead, they add new elements and gradually implement the means to redirect old links to new pages. A better summary can be found in this interview with Roy Fielding on InfoQ.

When working on REST APIs and hypermedia, your probing question should be, “if this were a website viewed by a browser, would I handle it the same way?” If it sounds crazy in that context, then you’re probably going down the wrong path.

Imagine a record that includes both firstName and lastName, but you want to add fullName. Don’t rip out the old fields. Simply add new ones. You might have to implement some conversions and handlers to help older clients not yet using fullName, but that is worth the cost of avoiding brittle changes to existing clients. It reduces the friction.
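A rough sketch of that additive evolution, with hypothetical field names, might look like this:

```python
# Evolve a record additively instead of versioning the API. Older clients keep
# reading firstName/lastName; newer clients can switch to fullName when ready.
def person_resource(first_name, last_name):
    return {
        "firstName": first_name,                   # kept for existing clients
        "lastName": last_name,                     # kept for existing clients
        "fullName": f"{first_name} {last_name}",   # new field, added alongside
    }

record = person_resource("Frodo", "Baggins")
```

No client breaks on the day `fullName` ships, which is exactly the property a version-number bump cannot give you.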

In the event you need to REALLY make a big change to things, a simple version number doesn’t cut it. On the web, it’s called a new website. So release a new API at a new path and move on.

People clamor HARD to get the super-secret “id” field from a data record instead of using the “self” link. HINT: If you are pasting together URIs to talk to a REST service, something is wrong. It’s either your approach to consuming the API, or the service itself isn’t giving you enough links to navigate it.

When you get a URI, THAT is what you put into your web page, so the user can see the control and pick it. Your code doesn’t have to click it. Links are for users.

Fighting REST

To this day, people are still fighting the concept of REST. Some have fallen in love with URIs that look like http://example.com/orders/523/lineitem/14 and http://example.com/orders/124/customer, thinking that these pretty URLs are the be-all/end-all of REST. Yet they code with RPC patterns.

In truth, formatting URLs this way, instead of as http://example.com/orders?q=523&li=14 or http://example.com/orders?q=124&mode=customer, is done to take advantage of HTTP caching when possible. A Good Idea(tm), but not a core tenet.

As a side effect, handing out JSON records like {orderId: 523} has forced clients to paste together links by hand. These links, not formed by the server, are brittle and just as bad as SOAP and CORBA, violating the whole reason REST was created. Does Amazon hand you the ISBN code for a book and expect you to enter it into the “Buy It Now” button? No.

Many JavaScript frameworks have arisen, some quite popular. They claim to have REST support, yet people are coming on to chat channels asking how to get the “id” for a record so they can parse or assemble a URI.

BAD DESIGN SMELL! URIs are built by the server along with application state. If you clone state in the UI, you may end up replicating functionality and hence coupling things you never intended to.

Hopefully, I’ve laid out some of the history and reasons that REST is what it is, and why learning what it’s meant to solve can help us all not reinvent the wheel of RPC.

#opensource is not a charity

Logging onto my laptop this morning, I have already seen two tickets opened by different people clamoring for SOMEONE to address their Stack Overflow question. They appeared to want an answer to their question NOW. The humor in all this is that the issue itself is only seven hours old, with one person begging for a response when their question was barely three hours old. Sorry, but open source is not a charity.

If you have a critical issue, perhaps you should think about paying for support. That’s what other customers do when they want a priority channel, and it definitely isn’t free as in no-cost. Something that doesn’t work is opening a ticket with nothing more than a link to your question.

Open source has swept the world. If you don’t get on board with using it, you risk being left in the dust. But too many think that open source is free, free, FREE. That is not the case. Open source means you can access the source code. Optimally, you have the ability to tweak, edit, refine, and possibly send back patches. But nowhere in there is no-cost support.

At a company committed to open source, we focus on building relationships with various communities. The Spring Framework has grown hand over fist in adoption and driven much of how the Java community builds apps today. Pivotal Cloud Foundry frequently has other companies sending in people to pair with us. It’s a balancing act to coach users not to assume their question will be answered instantly.

I frequent Twitter, GitHub, Stack Overflow, and other forums to try to interact with the community. If at all possible, I shoot to push something through. Many times, if we’re talking about a one-line change, it’s even easier. But at the end of the day, I have to draw a line and focus on priorities. This can irk some members not aware of everything I’m working on. That is a natural consequence.

Hopefully, as open source continues to grow, we can also mature people’s expectations between paid and un-paid support. Cheers!

P.S. For a little while longer, there is a coupon code to Learning Spring Boot for 50% off (Python Testing Cookbook as well!)

Use coupon code LSPT50.

Spring Boot is still a gem…waiting to be discovered

Last week, I had the good fortune of speaking twice at the DevNexus conference, the 2nd largest Java conference in North America. It was awesome! Apart from being a total geek fest with international attendance, it was a great place to get a bigger picture of the state of the Java community.

A bunch of people turned up for my Intro to Spring Data session, where we coded up an employee management system from scratch inside of 45 minutes.

It was a LOT of fun. It was like pair programming on steroids when people helped me handle typos, etc. It was really fun illustrating how you don’t have to type a single query to get things off the ground.

What I found interesting was how a couple people paused me to ask questions about Spring Boot! I wasn’t expecting this, so it caught me off guard when asked “how big is the JAR file you’re building?” “How much code do you add to make that work?” “How much code is added to support your embedded container?”

Something I tackled in Learning Spring Boot was showing people the shortest path to get up and running with a meaningful example. I didn’t shoot for contrived examples. What use are those? People often take code samples and use them as the basis for a real system. That’s exactly the audience I wrote for.

People want to write simple apps with simple pages leveraging simple data persistence. That is Spring Boot + Spring Data right out of the starting gate. Visit http://start.spring.io and get off the ground! (Incidentally, THIS is what my demo was about.)
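Incidentally, the same service behind start.spring.io can be driven from the command line as well. A quick sketch (the chosen dependencies and project settings are just an assumption about what you might want):

```shell
# Ask start.spring.io to generate a Maven project with web, JPA, and
# security starters, then unpack it into a working directory.
curl https://start.spring.io/starter.zip \
    -d dependencies=web,data-jpa,security \
    -d type=maven-project \
    -o demo.zip
unzip demo.zip -d demo
```

From there it’s straight into your editor, with a buildable project already in hand.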

I was happy to point out that the JAR file I built contained a handful of libraries along with a little “glue code” to read the JAR-within-a-JAR arrangement, plus the autoconfiguration stuff. I also clarified that the bulk of the code is actually your application + Tomcat + Hibernate; the size of Boot’s autoconfiguration is nothing compared to all that. Compare that to the time and effort needed to write deployment scripts, maintenance scripts, and whatever other glue you hand-write to deploy to an independent container. Spring Boot is a life saver in getting from concept to market.

It was fun to see at least one person in the audience jump to an answer before I could. Many in the audience were already enjoying Spring Boot, but it was fun to see someone else (who by the way came up to ask more questions afterward) discovering the gem of Spring Boot for the first time.

To see the glint in someone’s eye when they realize Java is actually cool? Well, that’s nothing short of amazing.

LVM + RAID1 = Perfect solution to upgrade woes

As I said before, I’m rebuilding an old system and have run into sneaky issues. In trying to upgrade from Ubuntu 12.04 to 14.04, it ran out of disk space at the last minute and retreated to the old install. Unfortunately, this broke its ability to boot.

Digging in, it looks like Grub 2 (the newer version) can’t install itself properly due to insufficient space at the beginning of the partition. Booting up from the other disk drives (from a different computer), I have dug in to solve the problem.

How do you repartition a disk drive that is part of a RAID 1 array, which is itself hosting an LVM volume group?

It’s not as hard as you think!

A mirrored RAID array means you ALWAYS have double the disk space needed. So… I failed half of my RAID array and removed one of the drives from the assembly. Then I wiped the partition table and built a new one… starting at a later cylinder.

POOF! More disk space instantly available at the beginning of the disk for GRUB2.

Now what?!? Creating a new RAID array partition in degraded mode, I add it to the LVM volume group as a new physical volume.

Then I launch LVM’s handy pvmove command, which moves everything off the old RAID array and onto the new one.

Several hours later, I can reduce the volume group and remove the older RAID array. With everything moved onto the newly resized partition, I can then destroy and rebuild the old disk with the same geometry as the new one, pair it up, and BAM! The RAID array is back in action, and I let it sync back up.

This should line things up to do a GRUB2 chroot installation.
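The steps above can be sketched roughly as follows. The device names (/dev/md0, /dev/md1, /dev/sda1, /dev/sdb1) and the volume group name (vg0) are assumptions for illustration; adapt them to your own system, and treat this as a sketch of the procedure, not a script to run blindly:

```shell
# 1. Fail and remove one half of the RAID 1 mirror.
mdadm /dev/md0 --fail /dev/sdb1 --remove /dev/sdb1

# 2. Repartition the freed disk (fdisk/parted), leaving space at the start
#    for GRUB2, then build a new RAID 1 array in degraded mode on it.
mdadm --create /dev/md1 --level=1 --raid-devices=2 missing /dev/sdb1

# 3. Add the new array to the volume group and migrate everything over.
pvcreate /dev/md1
vgextend vg0 /dev/md1
pvmove /dev/md0 /dev/md1      # this is the step that takes several hours

# 4. Drop the old array from the volume group and retire it.
vgreduce vg0 /dev/md0
pvremove /dev/md0
mdadm --stop /dev/md0

# 5. Repartition the old disk to match the new geometry, then re-pair it
#    and let the mirror sync back up.
mdadm /dev/md1 --add /dev/sda1
```

With the data migrated, the new partition layout leaves GRUB2 the room it needs at the front of the disk.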

With LVM, it’s easy to shrink and expand volumes, reclaim some spare space, and move stuff around. And you are nicely decoupled from the physical drives.

With RAID 1, you get high reliability through mirroring. And as a side effect, you always have a spare disk on hand if you need to move data around. I once moved live MythTV video data off the system to reformat my video partition as xfs.

Out with old and in with the new

I have been waiting a long time to resurrect an old friend of mine: my MythTV box. I built that machine ten years ago. (I’d show you the specs, but they’re locked away ON the machine in an antique MediaWiki web server.) It runs Root-on-LVM-on-RAID top to bottom (which, BTW, requires LILO).

It was a great project to build my own homebrew DVR. But with the advent of digital cable and EVERYTHING getting scrambled, those days are gone. So it’s sat in the corner for four years. FOUR YEARS. I’m reminded of this through periodic reports from CrashPlan.

I started to get this idea in my head that I could rebuild it with Mac OS X and make it a seamless backup server. Then I learned it was too old to support OS X. So it continued to sit, until I learned that I could install the right bits for it to speak “Apple” and hence become a Time Machine capsule.

So here we go! I discovered that I needed a new VGA cable and power cord to hook up the monitor. After those came in, I booted it up… and almost cried. Almost.

As I logged in, I uncovered neat stuff and some old commands I hadn’t typed in years. Suffice it to say, it is now beginning its first distro upgrade (probably with more to come after that), and when that’s done, I’ll migrate it off the Mythbuntu distro and instead pick mainline Ubuntu (based on GNOME).

Once that is done, I hope to install Docker so I can spin up needed services (like netatalk) much faster, and put it to work providing an additional layer of home backup support for both my and my wife’s Mac laptops.

Book Report: Area 51 by Bob Mayer

As indicated before, I started reading breakaway or debut novels by prominent authors last year. And here I am to deliver another book report!

Area 51 – Bob Mayer

Bob Mayer was one of the speakers at last year’s Clarksville Writer’s Conference. He was hilarious, gung ho, maybe a tad bombastic (retired Green Beret), and a best-selling author who had no hesitation bragging that he makes about $1000/day from his trove of published novels.

Like or hate his personality, he has succeeded, so I wanted to read one of his first works. It turns out this novel was released under the pen name “Robert Doherty” through classic channels. He has since gotten the IP rights for all these past novels reverted back to him, a business move worthy of respect, and moved on to e-books.

Back to the story. It really is pretty neat. The writing is crisp, the dialog cool. I kept turning page after page, wanting to know what happens. I also had an inbuilt curiosity about what this author would do. I have seen TV shows set around Area 51, like Seven Days and Stargate SG-1 (based near Area 51 and steeped in similar military conspiracy), and other movies.

There was a bit of investigative journalism gone wrong, combined with other historical legends. I must admit that part (I won’t give it away!) really whetted my appetite.

Bob Mayer indeed knows how to write. He knows how to make you turn the pages. I think I spent 3-4 days tops reading this book. I’ll confess it didn’t match my hunger in reading the debut Jack Reacher novel KILLING FLOOR. But then again, I’m finding it hard to spot the next novel that will compete on that level.

I’ll shoot straight with you on this: it wasn’t as hard to move on to another author’s novel when I finished as it has been with certain other books. There were other series novels I read last year that made it hard to stop and move on instead of continuing the series. This one wasn’t the same. Will I ever go back and read more of Bob Mayer’s books?

Maybe, maybe not. I have read some of his other non-fiction books on writing craft, so in a sense, the man has already scored additional sales. It takes a top-notch story with top-notch characters and top-notch writing to score that with me, and Jack Reacher has made me picky. Don’t take it as a knock.

If you like SciFi and military conspiracies, you’ll find this book most entertaining.

Happy reading!