Ben's Bay

Welcome!

For whatever reason, you're here now. What you see is interpreted by your browser and rendered by your machine, altered first by the technology and finally by your perception.

Here's a person; it's me. :)

There used to be more here but the Internet isn't what it was back in 1999, so I have deleted pretty much everything.


This website is a tiny, outdated window. But I've been here since before blogging was even a word.

LinkedIn Profile.

Projects (some of them, anyway)

Notable Links

Golang

Here is my solution to the "Exercise: Web Crawler" problem from A Tour of Go.

package main

import (
    "fmt"
    "time"
)

type Fetcher interface {
    // Fetch returns the body of URL and
    // a slice of URLs found on that page.
    Fetch(url string) (body string, urls []string, err error)
}

// Crawl uses fetcher to recursively crawl
// pages starting with url, to a maximum of depth.
func Crawl(url string, depth int, urlchan chan CrawlUrl, crawlsem chan int, fetcher Fetcher) {
    defer func() { <-crawlsem }() // pull our token off the semaphore when this crawl finishes
    if depth <= 0 {
        return
    }
    body, urls, err := fetcher.Fetch(url)
    if err != nil {
        fmt.Println(err)
        return
    }
    fmt.Printf("found: %s %q\n", url, body)
    for _, u := range urls {
        urlchan <- CrawlUrl{u, depth}
    }
    return
}

type CrawlUrl struct {
    url string
    depth int
}

func Crawler(url string, depth int, fetcher Fetcher) {
    fmt.Printf("Crawling: %s\n", url);
    urlchan := make(chan CrawlUrl)
    crawlsem := make(chan int, 5)
    donelist := make(map[string]bool)
    donelist[url] = true // mark the starting URL so it is never fetched twice
    crawlsem <- 0
    go Crawl(url, depth, urlchan, crawlsem, fetcher)
    for {
        select {
        case u := <-urlchan:
            if _, found := donelist[u.url]; !found {
                donelist[u.url] = true
                crawlsem <- 0
                go Crawl(u.url, u.depth-1, urlchan, crawlsem, fetcher)
            }
        case sem := <-crawlsem:
            // crawlers are still active
            crawlsem <- sem
            time.Sleep(time.Millisecond) // yield
        default:
            // No crawlers are active, and no urls left
            return
        }
    }
}

func main() {
    Crawler("http://golang.org/", 4, fetcher)
}
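
The fetcher that main hands to Crawler is the canned fake fetcher that the Tour supplies with the exercise skeleton; it isn't reproduced above, so the snippet won't build on its own. A minimal stand-in for it, meant to live in the same file, might look like the following (the URLs and page bodies are just placeholder data):

// fakeFetcher is a Fetcher that returns canned results instead of
// touching the network.
type fakeFetcher map[string]*fakeResult

type fakeResult struct {
    body string
    urls []string
}

// Fetch looks the URL up in the map and reports an error for anything unknown.
func (f fakeFetcher) Fetch(url string) (string, []string, error) {
    if res, ok := f[url]; ok {
        return res.body, res.urls, nil
    }
    return "", nil, fmt.Errorf("not found: %s", url)
}

// fetcher is a small pre-populated fakeFetcher for Crawler to walk.
var fetcher = fakeFetcher{
    "http://golang.org/": &fakeResult{
        "The Go Programming Language",
        []string{"http://golang.org/pkg/", "http://golang.org/cmd/"},
    },
    "http://golang.org/pkg/": &fakeResult{
        "Packages",
        []string{"http://golang.org/", "http://golang.org/pkg/fmt/"},
    },
    "http://golang.org/pkg/fmt/": &fakeResult{
        "Package fmt",
        []string{"http://golang.org/", "http://golang.org/pkg/"},
    },
}

With that in place, the program prints each reachable page once, reports "not found" for the one URL missing from the map, and exits once the semaphore drains and no URLs are left on the channel.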

Java

I'm pleased to have discovered how a Java program can modify its own environment variables, so that programs or libraries subsequently loaded can use them. Here it is:

import com.sun.jna.Library;
import com.sun.jna.Native;
import com.sun.jna.Platform;

/** This class sets environment variables, using libc via JNA.  
 * It's actually quite a simple use of the powerful and promising
 * JNA library, which Sun produced.
 */
public class SetEnv {

    // This is the standard, stable way of mapping, which supports extensive
    // customization and mapping of Java to native types.
    public interface CLibrary extends Library {
        CLibrary INSTANCE = (CLibrary)
            Native.loadLibrary((Platform.isWindows() ? "msvcrt" : "c"),
                               CLibrary.class);
    
        // Note: msvcrt exports _putenv rather than setenv, so the Windows
        // ("msvcrt") branch above would need a different mapping to work there.
        int setenv(String name, String value, int overwrite);
        String getenv(String name);
    }

    public static void main(String[] args) {
        String var = "VARNAME";
        String val = CLibrary.INSTANCE.getenv(var);
        System.out.printf("%s\t%s\n", var, val);
        CLibrary.INSTANCE.setenv(var, "PKCS11_LIB", 1);
        val = CLibrary.INSTANCE.getenv(var);
        System.out.printf("%s\t%s\n", var, val);
    }
}

Fin

Tact involves skill and patience; patience involves love.